Sunday, July 31, 2011

STP Conference in Nashville Continues - Virtual Systems

Choosing a session for Block 2 was just as difficult as Block 1.  There were three topics of interest to me and two familiar speakers.  There were sessions on Mobile Testing, Agile Testing, and Performance Testing.  Mobile testing is certainly the newest kid on the block, but I opted for performance testing.

There were three reasons I chose performance testing.  First, my colleagues were going to the Agile Testing session by Bob Galen.  Second, I love learning more about performance testing.  And finally, the presentation was being given by Dan Downing, whom I consider a mentor in the field of performance testing.  The session was entitled "Performance Testing Virtualized Systems".

Dan introduced the topic by highlighting the pace at which organizations are moving to virtualization and the many pain points associated with this movement.  It was the next slide that captured my curiosity.

Dan then went on to explain six critical factors for testers.

The first was the "anatomy of a Virtual System".  I will be honest: I really do not know much about virtual systems.  So the key point for me was that I need to know and understand the system under test.

Mapping workloads was the next topic.  Dan is brilliant enough to do this type of work; it is something I would have to seek assistance on.  Nonetheless, if you are going to do performance testing, you must know how data flows through the system and how various conditions can influence that flow.

Dan's next point was about bottlenecks in a virtual system.  My first thought was: that is why we do performance testing, to find the bottlenecks in the system.  True, but Dan's point was to understand the applications and confirm they are properly distributed, based on your knowledge of the dedicated system.  For me this tied the first two points together: you need to understand the potential areas of risk and design your performance testing to properly measure those key areas.

The next topic was testing technique.  Dan talked about executing the tests in parallel.  Do not let time gaps call your test results into question, because system performance can be influenced differently depending on the time of day.  For example, late in the evening database backups may run, or ETL jobs may move data from system A to system B.  Honestly, in my short performance testing career this technique had not occurred to me.  It makes sense, because I have spent time in the past trying to understand nonsensical data.  If you have the resources to run in parallel, I think this is a great idea.  If you do not, then it comes back to the first three points about knowing the system under test.  Performance testing is often influenced by many disparate systems, so you will still have the potential for unexplainable results.
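To make the parallel-execution idea concrete, here is a minimal sketch. The `run_load_test` function is a hypothetical stand-in for kicking off a real load generator; the point is only that both environments start at the same moment, so time-of-day effects like backups or ETL jobs hit both runs equally.

```python
# Sketch: run load tests against the dedicated and virtual environments
# in parallel so both see the same time-of-day conditions.
import time
from concurrent.futures import ThreadPoolExecutor

def run_load_test(environment):
    """Placeholder for a real load-test run; records when it started."""
    start = time.time()
    time.sleep(0.1)  # simulate the test duration
    return {"environment": environment, "started_at": start}

with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_load_test, ["dedicated", "virtual"]))

# Both tests started within a fraction of a second of each other, so any
# background activity influences both result sets the same way.
skew = abs(results[0]["started_at"] - results[1]["started_at"])
print(f"start-time skew: {skew:.3f}s")
```

With sequential runs, that skew could be hours, which is exactly the gap Dan warned about.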

Mr. Downing then talked about test execution.  He split the load into three phases.  Start with a "light" load; his point was to establish a baseline for each system where it is performing optimally and there is limited competition for resources.  Then progress to a medium load test where you verify that everything scales properly and there are no bottlenecks, such as the database.  Finally, move to the stress phase, where you determine failure points and confirm that the systems fail in an appropriate manner.
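The three phases might be sketched as a simple ramp. The `measure_response_time` function below is an invented toy model, not anything from the talk; the structure is what matters: capture a baseline under light load, then compare the medium and stress phases against it.

```python
# Sketch of the light -> medium -> stress progression. The latency model
# is hypothetical: flat until the system saturates, then rising sharply.

def measure_response_time(virtual_users):
    base_ms = 50
    if virtual_users <= 100:
        return base_ms  # system scaling comfortably
    return base_ms * (virtual_users / 100) ** 2  # past the saturation point

phases = [("light", 10), ("medium", 100), ("stress", 500)]
baseline = None
report = {}
for name, users in phases:
    latency = measure_response_time(users)
    if name == "light":
        baseline = latency  # optimal performance, limited resource competition
    report[name] = {"users": users,
                    "latency_ms": latency,
                    "vs_baseline": latency / baseline}
print(report)
```

A medium-load ratio near 1.0 suggests the system is scaling; a large stress-phase ratio marks where it starts to fail, which is where you check that it fails gracefully.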

The next section of the talk was, of course, about the zillions of metrics you should measure and monitor.  I think Dan is spot on in mentioning all of the measurements, and if you have the tools and resources you should absolutely look at everything.  Unfortunately, when I do have the time to help with performance testing I am squeezed for it.  I try to pick out maybe five key metrics to gather results on and use those results to determine whether further monitoring is needed.  Performance results data can be overwhelming.  Dan certainly knows what tools to use and is a wizard at data analysis.  Having tools and systems in place to conduct real-time monitoring can certainly ease the data analysis paralysis.
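My "five key metrics first" habit amounts to a triage pass like the one below. The metric names and thresholds are purely illustrative assumptions (CPU ready time and memory ballooning are commonly cited virtualization metrics, but these were not Dan's specific numbers).

```python
# Sketch: gather a small set of key metrics first, and only drill deeper
# on the ones that are out of range. Names and thresholds are made up.

samples = {
    "avg_response_ms":  420,
    "error_rate_pct":   0.2,
    "cpu_ready_pct":    12.0,  # time a VM waits for a physical CPU
    "mem_ballooning_mb": 0,    # host reclaiming memory from the guest
    "disk_latency_ms":  18,
}

thresholds = {
    "avg_response_ms":  500,
    "error_rate_pct":   1.0,
    "cpu_ready_pct":    5.0,
    "mem_ballooning_mb": 0,
    "disk_latency_ms":  20,
}

# Metrics over threshold justify pulling in the heavier monitoring tools.
needs_follow_up = [m for m, v in samples.items() if v > thresholds[m]]
print(needs_follow_up)
```

The payoff is that the overwhelming pile of data only gets opened for the handful of metrics that actually misbehaved.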

Dan spent some time in this presentation on data analysis.  The key takeaway for me was his focus on comparative data.  The goal of this performance test was to compare a dedicated system to a virtual system.  Assuming you know the dedicated system's performance from a historical perspective, you have the best heuristic.  Show data for both systems in the results; differences become more apparent, and you can then investigate them.
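A minimal sketch of that side-by-side view, with invented transaction names and timings, might look like this: compute each transaction's change relative to the dedicated baseline and flag the outliers.

```python
# Sketch: dedicated vs. virtual results side by side. All numbers are
# made up for illustration; the 10% flag threshold is an assumption.

dedicated = {"login_ms": 180, "search_ms": 450, "checkout_ms": 900}
virtual   = {"login_ms": 195, "search_ms": 700, "checkout_ms": 940}

# Percent change of the virtual system relative to the dedicated baseline.
deltas = {txn: (virtual[txn] - dedicated[txn]) / dedicated[txn] * 100
          for txn in dedicated}

# Anything more than 10% off the baseline gets investigated.
to_investigate = [txn for txn, d in deltas.items() if abs(d) > 10]

for txn, d in deltas.items():
    flag = "  <-- investigate" if txn in to_investigate else ""
    print(f"{txn:12s} dedicated={dedicated[txn]:4d} "
          f"virtual={virtual[txn]:4d} delta={d:+6.1f}%{flag}")
```

Laid out this way, a transaction that degraded badly under virtualization jumps off the page, which is exactly the effect Dan was after.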

For me it was a thought-provoking presentation.  Dan concluded by stating how important it is for testers to keep up with the latest trends and develop new skills to keep pace with them.  The performance testing fundamentals are the same, but testers need to stay ahead of the curve in order to provide value.

If you have never heard Dan speak, you should.  He has great passion and enthusiasm for performance testing.  More importantly, he likes to share this knowledge with everyone.  If you attend STPCon in Dallas and you love performance testing, introduce yourself to Mr. Dan Downing.
