Monday, May 30, 2011

Workshop at STP Conference

 I attended a workshop prior to the STP Conference in Nashville called "Creating & Leading a High Performance Test Organization".  Bob Galen was the presenter.

Bob did a great job presenting a ton of material.  Honestly, much of the material seemed like common sense; however, the information was packaged nicely.  The material served as that gentle kick in the pants, reminding us that as test leaders we need to revisit our foundations.

Tons of things were covered, including recruiting, outsourcing, marketing the test organization, defect management, leadership, and communication.  I am going to focus on two topics.

Effective Communication

Mr. Galen talked about knowing your audience and adopting their point of view.  This makes sense if you have some idea of who the person you are talking to is and what they do.  Even in this happy-path situation, how you phrase your communication can be extremely complex.  What happens if you meet the person for the very first time?  How do you get their point of view into context?

My conclusion about effective communication is that it is really hard.  We must always work on our communication skills.  Somehow in a conversation you must size up the moment.  In other words, put some context around the current environment: both the setting and the mood of the participants.

Bob's second point of "active listening" is probably the key to effective communication.  One approach might be to break the ice with an introductory statement, then carefully listen to the response.  Somehow we must grab the clues that help us to know our audience.  Honestly, I tend not to be very good at the active listening part.  Just ask my wife!

One other point Bob had was "can your audience handle the truth and how much of the truth".  This is a tough one for me.  I have an opinion and I tend to share it regardless of the impact on the audience.  As a communicator I need to learn to be more judicious with my opinion and determine how much of the truth is appropriate for the audience at hand.

Effective communication is critical in everything we do.  It is not easy, and it is something we can always improve upon.


At this workshop there was some interesting discussion around defects: when to document them and when not to.  One person in the audience felt it was critical to document every defect.  Others felt that there are times when it is appropriate not to spend the time documenting a defect.  I shall not continue this debate here, but I thought Bob had a couple of great points in his material.

"A good report is written, numbered, simple, understandable, reproducible, legible and non-judgmental."   I agree with this statement, but there is one attribute that stood out for me, non-judgmental.  As testers we need to do our best to remove emotion from a defect report.  We should be concise, state the discovery process, and add supporting material as facts.

Bob also provided a list of styles testers should consider when putting together a defect report.

1. Condense – Say it clearly but briefly
2. Accurate – Is it truly a defect?
3. Neutralize – Just the facts
4. Precise – Explicitly, what is the problem?
5. Isolate – What has been done to isolate the problem?
6. Generalize – How general is the problem?
7. Re-create – Essential environment, steps, conditions
8. Impact – To the customer, to testing, safety?
9. Debug – Debugging materials (logs, traces, dumps, environment, etc.)
10. Evidence – Other documentation proving existence

I felt like this was a pretty good reminder of how to write an effective defect report.  Perhaps I can develop this into a little acronym - CAN-PIG-RIDE. 
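Just for fun, the checklist could even be turned into a simple template.  Here is a quick Python sketch of my own (not from Bob's material) where each field maps to one item on the list, with a helper that flags which parts of the checklist still need attention:

```python
from dataclasses import dataclass, field, fields
from typing import List

@dataclass
class DefectReport:
    """One possible shape for a report that follows CAN-PIG-RIDE."""
    condense: str = ""      # short, clear summary
    accurate: str = ""      # why we believe it is truly a defect
    neutralize: str = ""    # just the facts, no emotion
    precise: str = ""       # explicitly, what is the problem?
    isolate: str = ""       # what was done to isolate the problem
    generalize: str = ""    # how general is the problem?
    recreate: List[str] = field(default_factory=list)  # env, steps, conditions
    impact: str = ""        # customer, testing, safety
    debug: List[str] = field(default_factory=list)     # logs, traces, dumps
    evidence: List[str] = field(default_factory=list)  # supporting documentation

    def missing(self) -> List[str]:
        """Which parts of the checklist are still empty?"""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

report = DefectReport(condense="Login fails with a 500 after password reset")
print(report.missing())  # everything except 'condense' still needs attention
```

A tester could run `missing()` before submitting a report as a quick self-review against the checklist.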

I need to do a better job of effective communication and part of that effort is making sure defect reports are communicated effectively.

Happy Testing!

Friday, May 20, 2011

Defect Tracking

I still plan to post thoughts on the STP Conference in Nashville, but some recent posts have spawned some thought.

Lisa Crispin gave a presentation at Star East that sparked Gojko to write a post entitled "Bug Statistics are a Waste of Time".

I agree with the notion that we should clearly understand the business objectives and find ways to measure the value the features bring to a customer community.  I do not agree that looking at bug statistics is a waste of time.  History is one of the greatest oracles available to a tester.  How were we doing in the past compared to today?  Are there any lessons to be learned from a software system's past defects?  I certainly think so.

Let's assume that company X provides some value to someone and that company X knows how to measure their business objectives and the value of those objectives using things like Net Promoter Score, Google Analytics, Agile Velocity, and Get Satisfaction.

What can inspecting defect metrics add to the cause of determining value?

I view metrics as flashlights into a dark cave.  How do you know what is there unless you look?

A simple inspection of the total number of defects in the backlog implies some level of technical debt.  I agree with Gojko that if defects simply sit in a backlog then we are wasting some time.  Teams should and must proactively triage, fix, or even throw away defects.  But failing to document them in a searchable manner would be detrimental to the team.

Teams should occasionally have retrospectives on their processes.  Finding data regarding defect groupings is a fantastic lever for continuous improvement.  Where have the majority of our bugs historically been clustering?  Where you find defect clusters, you have the opportunity to change your process to reduce those clusters.  Agile teams especially should do look-backs at some frequency.  How did we do last quarter?  How does the trending look?
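To make the flashlight idea concrete, here is a tiny Python sketch of that kind of inspection.  The defect records and field names are invented for illustration; in practice this data would come from your tracker's export or API:

```python
from collections import Counter

# Hypothetical defect records pulled from a tracker export;
# the components and quarters below are made up for illustration.
defects = [
    {"component": "checkout", "opened": "2011-Q1"},
    {"component": "checkout", "opened": "2011-Q1"},
    {"component": "search",   "opened": "2011-Q1"},
    {"component": "checkout", "opened": "2011-Q2"},
    {"component": "login",    "opened": "2011-Q2"},
]

# Where are the majority of our bugs clustering?
clusters = Counter(d["component"] for d in defects)
print(clusters.most_common())  # 'checkout' leads the pack

# How does trending look quarter over quarter?
trend = Counter(d["opened"] for d in defects)
print(sorted(trend.items()))
```

Even something this simple gives a retrospective a factual starting point: the team can ask why checkout is the hot spot rather than arguing from memory.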

As a tester I have at times had an extremely difficult time advocating for a process change.  In several situations I have found the ammunition to influence change by showing historical trends.  Could we do this carrying around a notebook?  Sure we could, but it would be difficult, especially given how quickly time flies.

Let me toss this scenario into the mix.  You are a new tester at a large company, or even a consultant.  Your mission is to understand the quality of the software and you have to do it fast.  It would be nice to shine a flashlight into the new cave and know the areas of risk.  Yes, you could put your hands on the application and start testing, but a sneak peek at the historical defect data could narrow in on the best place to start.

Yes, some defect tracking tools really, really suck, but our ability as testers to search, learn, and educate provides great value, even to a Company X that knows how to monitor and measure value.

I shared Gojko's link with a large community of developers and testers.  I shared the link not because I agreed with it, but because it made me think.  I received back a quote that really struck a chord with me.

"Sure hope this isn’t the future of QA."

Losing the oracle of history would be a huge mistake.  Using metrics prudently, adapting the metrics to changing business value, and having conversations around the findings are key elements of Continuous Improvement.

If we influence a change for the better using metrics, then we certainly are not wasting our time! 

Read Gojko's post and the associated comments; it definitely should spark some thought.

Happy Testing! 

Saturday, May 14, 2011

Funny thing Happened at STP in Nashville

Testers, I must find the time to catch up on my backlog of posts.  I thought I would start out with an embarrassing tale.

Once upon a time there was a tester, Carl, who had signed up for a workshop at the STP conference in March 2011 in Nashville, TN.  It was a beautiful day and the coffee was flowing.  Carl happened to be running late for the workshop.  All Carl knew was that the workshop was labeled Pre-5.

Carl raced up the stairs (very hard to do with a tender knee).  There at the top of the stairs on the right was Pre-5.  The instructor's name was Bob.  Carl remembered his instructor's name was Bob.  The title of the workshop was Quality Monitoring and Coaching.  For some reason Carl seemed out of place.

Everyone introduced themselves.  Of course I introduced myself and why I was there.  What did I say?  "Hi, I am Carl Shaulis and I am a Test Manager at HomeAway.  I recently have been working closely with Customer Service to facilitate quality, and I am here to continue learning how to bridge the gaps between our two teams."  Wait one second, that does not sound right.  Carl pulls out his conference brochure.  As he thumbs through the pages, Fiona walks into the conference room.  Oh, Carl knows Fiona from STP Las Vegas.  Carl must be in the right place.  Carl finally finds the workshop he should be in, called "Creating and Leading a High Performance Test Organization" by Bob Galen.

Panic sets in!  I am in the wrong room.  I just gave a great introduction on how this presentation could benefit me.  What should Carl do?

Carl quickly packed up his gear and headed out of the room.  Carl found his intended workshop and enjoyed the Spring STP Conference 2011.

Carl did get some friendly ribbing and crap from Abbie and Fiona.  Deservedly so!  The secret was out of the bag that there were two concurrent conferences: Contact Center Conference Expo 2011 and Software Test Professionals.

The really fun part is that Carl probably would have learned a hell of a lot if he had stayed in the incorrect workshop.  However, interacting with the testers and learning from Bob Galen was the right move.

Happy Testing!