Sunday, July 31, 2011

STP Conference in Nashville Continues - Virtual Systems

Choosing a session for Block 2 was just as difficult as Block 1.  There were three topics of interest to me and two familiar speakers.  The sessions covered Mobile Testing, Agile Testing, and Performance Testing.  Mobile testing is certainly the newest kid on the block, but I opted for performance testing.

There were three reasons I chose performance testing.  For one, my colleagues were going to the Agile Testing session by Bob Galen.  Second, I love learning more about performance testing.  And finally, the presentation was being given by Dan Downing, whom I consider a mentor in the field of performance testing.  The session was entitled "Performance Testing Virtualized Systems".

Dan introduced the topic by highlighting the pace at which organizations are moving to virtualization and the many pain points associated with this movement.  It was the next slide that captured my curiosity.

Dan then went on to explain six critical factors for testers.

The first was the "anatomy of a virtual system".  I will be honest: I really do not know much about virtual systems, so the key point for me was that I need to know and understand the system under test.

Mapping workloads was the next topic.  Dan is brilliant enough to do this type of work; it is something I would have to seek assistance on.  Nonetheless, if you are going to do performance testing, you must know how data flows through the system and how various conditions can influence that data flow.

Dan's next point was about the bottlenecks of a virtual system.  My first thought was, that is why we performance test: to find the bottlenecks in the system.  True, but Dan's point was to understand the applications well enough to confirm they are properly distributed, based on your knowledge of the dedicated system.  For me this tied the first two points together: you need to understand the potential areas of risk and design your performance testing to properly measure those key areas of risk.

The next topic was testing technique.  Dan talked about executing the tests in parallel.  Do not let time gaps call your test results into question, because system performance can be influenced differently depending on the time of day.  For example, late in the evening database backups may take place, or ETL pulls may move data from system A to system B.  Honestly, in my short performance testing career this technique had not occurred to me.  It makes sense, because I have spent time in the past trying to understand nonsensical data.  If you have the resources to run in parallel, then I think this is a great idea.  If you do not have the resources, then it comes back to the first three points about knowing the system under test.  Performance testing is often influenced by many disparate systems, so you will still have the potential for unexplainable results.
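To make the parallel idea concrete, here is a minimal sketch in Python.  The load function, target URLs, and pacing are hypothetical stand-ins for whatever load tool you actually use; the point is simply that both runs share the same clock, so time-of-day effects hit both systems equally.

```python
import threading
import time

# Hypothetical load driver: in practice this would be JMeter, LoadRunner,
# or a similar tool pushing real traffic at the system under test.
def run_load(system_name, target_url, duration_secs, results):
    start = time.time()
    request_count = 0
    while time.time() - start < duration_secs:
        # ... issue a request against target_url and record its timing ...
        request_count += 1
        time.sleep(0.01)  # placeholder pacing
    results[system_name] = request_count

results = {}
# Start both tests at the same moment so late-night backups, ETL pulls,
# and other time-of-day influences affect both systems equally.
threads = [
    threading.Thread(target=run_load, args=("dedicated", "http://dedicated.example", 5, results)),
    threading.Thread(target=run_load, args=("virtual", "http://virtual.example", 5, results)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)
```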

Mr. Downing talked about test execution.  He split the load into three phases.  Start with a "light" load; his point was to establish a baseline for each system where the system is performing optimally and there is limited competition for resources.  Then progress to a medium load test, verifying that everything scales properly and there are no bottlenecks, such as at the database.  Finally, move to stress mode, where you determine failure points and confirm that the systems fail in an appropriate manner.
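As a sketch, the three phases might be expressed as a simple load profile like the one below.  The virtual-user counts and durations are my own illustrative guesses, not numbers from the talk.

```python
# Dan's three-phase execution expressed as a simple load profile.
# The virtual-user counts and durations are illustrative assumptions.
PHASES = [
    {"name": "light",  "virtual_users": 10,  "minutes": 15},  # baseline: optimal performance, little contention
    {"name": "medium", "virtual_users": 100, "minutes": 30},  # confirm scaling, watch for bottlenecks (e.g. the database)
    {"name": "stress", "virtual_users": 500, "minutes": 30},  # find failure points, confirm graceful failure
]

def execute(phase):
    # In practice, hand these parameters to your load tool of choice.
    print(f"Running {phase['name']} load: "
          f"{phase['virtual_users']} users for {phase['minutes']} minutes")

for phase in PHASES:
    execute(phase)
```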

The next section of the talk was, of course, about the zillions of metrics you should measure and monitor.  I think Dan is spot on in mentioning all of the measurements, and if you have the tools and resources you should absolutely look at everything.  Unfortunately, when I do have time to help with performance testing, I am squeezed for time.  I try to pick out maybe five key metrics to gather results on, and use those results to determine whether further monitoring is needed.  Performance results data can be overwhelming.  Dan certainly knows what tools to use and is a wizard at data analysis.  Having tools and systems in place to conduct real-time monitoring can certainly ease the data analysis paralysis.
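For what it is worth, here is a sketch of the kind of small metric sample I mean, using Python's third-party psutil library on the machine being measured.  Which five metrics matter most will vary by system; these are just the ones I tend to reach for first.

```python
import time
import psutil  # third-party library: pip install psutil

# Sample a handful of key metrics during a test run. This samples the
# local machine; a real setup would monitor the servers under test.
def sample_metrics():
    disk = psutil.disk_io_counters()
    net = psutil.net_io_counters()
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_read_bytes": disk.read_bytes,
        "disk_write_bytes": disk.write_bytes,
        "net_bytes_sent": net.bytes_sent,
    }

samples = []
for _ in range(5):  # take a few samples over the course of the run
    samples.append(sample_metrics())
    time.sleep(2)
for s in samples:
    print(s)
```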

Dan spent some time in this presentation on data analysis.  The key takeaway for me was his focus on comparative data.  The goal of this performance test was to compare a dedicated system to a virtual system.  Assuming you know the dedicated system's performance from a historical perspective, you have the best heuristic.  Show data for both systems in the results; differences become more apparent, and you can then investigate those differences.
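A tiny sketch of that comparative view: run the same scenario against both systems, then show the summary statistics side by side.  The response times below are made-up placeholders; real numbers would come from your load tool's results.

```python
import statistics

# Hypothetical response times (seconds) from the same scenario run
# against each system; real data would come from your load tool.
dedicated = [0.21, 0.24, 0.22, 0.30, 0.25, 0.23, 0.28, 0.26]
virtual = [0.25, 0.31, 0.27, 0.45, 0.29, 0.33, 0.52, 0.30]

def summarize(name, times):
    times = sorted(times)
    p90 = times[int(len(times) * 0.9) - 1]  # crude 90th percentile
    print(f"{name:10s} median={statistics.median(times):.2f}s "
          f"p90={p90:.2f}s max={max(times):.2f}s")

# Side-by-side summaries make the differences jump out, and the
# differences are where the investigation starts.
summarize("dedicated", dedicated)
summarize("virtual", virtual)
```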

For me it was a thought-provoking presentation.  Dan concluded by stating how important it is for testers to keep up with the latest trends and develop new skills to keep pace with them.  The performance testing fundamentals remain the same, but testers need to stay ahead of the curve in order to provide value.

If you have never heard Dan speak, you should.  He has a great passion and enthusiasm for performance testing.  More importantly, he likes to share this knowledge with everyone.  If you attend STPCon in Dallas and you love performance testing, introduce yourself to Mr. Dan Downing.

Sunday, July 24, 2011

Note on Performance Testing from Scott Barber

At the STP Conference in Nashville I had many daily decisions to make.  After James Bach's inspirational keynote came my first important decision.  There were two speakers, both of whom wrote featured articles in the book "Beautiful Testing".  One of my co-workers was going to listen to Karen Johnson on "The Strategy Part of the Test Strategy", so I chose to listen to Scott Barber.

Scott, in his usual flamboyant style, delivered a passionate presentation on performance testing, "A Performance Testing Life Story: From Conception to Headstone".  I also have a passion for performance testing, but I have a ton of learning to do.  Scott's presentation went right to the heart of the performance testing life cycle.  Here is a summary:

1.  When building a software product, it is critical to consider performance during the architecture and design phase.  Performance should be part of the DNA, with performance-specific questions asked as the software concept evolves.

2.  We should set performance targets, then profile and test the code at the component level (a minimal sketch of such a check follows this list).

3.  We should continue profiling and performance unit testing, but also add in environmental performance testing and load or stress testing.

4.  There should be a tuning phase where we do our best to optimize performance prior to launch.

5.  We should performance test every patch, update, or configuration change.

6.  Even sunsetting applications should be monitored for performance.
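To make point 2 concrete, here is a minimal sketch of a component-level performance check in Python.  The lookup_customer function and its 50 ms target are hypothetical; the idea is just that a component gets an explicit target and a test that holds it to that target.

```python
import time

def lookup_customer(customer_id):
    time.sleep(0.01)  # stand-in for the real component under test
    return {"id": customer_id}

def test_lookup_meets_target():
    target_secs = 0.050  # agreed performance target for this component
    start = time.perf_counter()
    lookup_customer(42)
    elapsed = time.perf_counter() - start
    assert elapsed < target_secs, f"lookup took {elapsed:.3f}s, target is {target_secs}s"

test_lookup_meets_target()
print("component meets its performance target")
```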

Scott summarized his presentation by stating, "Successful people and applications are tested, monitored and tuned all of their lives."

At most companies I have worked for, performance testing is the last thing considered.  To some extent performance is thought about during the design phase, but it is not actually tested until the end of the life cycle.

Performance testing is hard!  A poorly performing site can hurt a reputation, so we should fold performance into the corporate DNA.

Bust out the performance tools and "Git Er Done"!

Wednesday, July 20, 2011

James Bach at STP Conference in Nashville

I can honestly state that the reason I have not been blogging lately is James Bach.  Just kidding, of course, but he certainly inspired me with his keynote "Notes from a Testing Coach".  I have been extremely busy mentoring, learning, implementing, collaborating, testing, and innovating.  It has been extremely fun, and I owe the energy to Mr. James Bach.

He opened his keynote by explaining the three kinds of practical credentials: portfolio (your past work), performance (demonstrating your ability), and reputation (stories told about you).  All of these combine to establish credibility.  Testers should actively work on their portfolio and consistently demonstrate their skills.  If you do these two things well, hopefully "good" stories will be told about you.

James mentioned that one of the things that can get in the way of mentoring is feelings.  He is very accurate in this assessment.  Once you get beyond feelings, testers have the ability to leap tall buildings in a single bound.

The coaching process involves building relationships, posing challenges, allowing things to happen, retrospection or diagnosis of the problem, and collaboration.  There will be setbacks as well as celebrations of success.

I recently visited London and found myself hearing the words "Mind the Gap" in my sleep.  I remember Mr. Bach saying "mind the syllabus".  My interpretation of these words, months after the keynote, is that as a mentor you should have a plan for teaching, just as you should have a plan for session-based testing.  I may have this way out of context at this point, so I will need to do some research.

James also talked about how, as a mentor, you must be prepared to demonstrate to the student what you might do.  In other words, you may reach a point in your mentoring where you have to roll up your sleeves and lead by example.

Another huge lesson from this keynote was his demonstration of the hidden picture.  I think the main point was to mess with his brother, but the demonstration illustrated how a tester can explore, change focus, change approach or technique, and get reasonable coverage rapidly, yet still not find a potentially large defect.

James gave an overview of the dice game.  I had inquired via email with James about how to execute the dice game.  Through that email collaboration I got a reasonable idea of the intent of the game, but I was extremely fortunate to be able to learn more about it in person with his brother Jon.  This hands-on experience had a huge impact on me.  The conversation and the approaches Jon took clearly illustrated how testers can benefit by rapidly assessing patterns.  I now try to show this game to every tester I encounter.  I think it is fun and, most importantly, it invokes thought.  I even went to a local game store and bought the Cast Elk puzzle.  I have yet to solve it!  I keep trying, but no success.  I know the answer is on YouTube, but I refuse to cave in.  During my travels I now buy puzzle books and attempt puzzles I never thought I could do.  I am extremely amazed at how fun learning and challenging yourself can be.  Thanks, James and Jon, for this inspiration.  Jon also turned me on to a site, http://www.sporcle.com/.

This is some of the value I took away from this keynote presentation.  There was much more content that I do not recall.  I can honestly say that this keynote was a true inspiration to me, not only as a tester but in my everyday life.

A huge "Nice Bike" to James for the key note and to Jon for taking the time to experience the dice game with me.

Dice game ROCKS!