Tuesday, 17 May 2011

Problem Analysis - Mind Maps & Thinking

Credit to Peter Haworth-Langford, Christin Wiedemann and Oscar Cosmo, with whom I was chatting back in December...

The topic of mind maps was discussed (in various usages) and I realised that maybe I had a usage that I need to write about and digest...

I frequently get asked to do a root cause analysis type of activity. Usually it's connected with faults reported against a product, but other times it might be to analyse a situation or process. In all cases I usually have a main question in my head, "what is the real problem?"

Below is an example of using these approaches for a root cause analysis of a fault. The analysis is used as a learning opportunity - to see if there is anything that needs to be changed or improved, including project and team structures and aspects of how they work.

A typical root cause analysis (RCA) in my shop might look like the following. This is one example and there are many other variants.

The example "presumes" there was some missing testing (which might reasonably have been expected to happen), but it could just as easily be missing design analysis, design coding error or any of a range of other issues.

I have two main techniques that I use to help reveal information - and so help me on my way to "the real problem". Note, I say this as though there is only one, but I'm well aware there are usually many different competing aspects.

The first is the 5-whys; the second is what I think of as the "are your lights on?" approach. I use them in distinct ways:

  • 5-whys - extracting raw data
  • Lights-on - attaching meaning to the analysis

The 5-Whys
This is the forensic search for raw data - collecting the different aspects of the problem: what the problem was, its solution, aspects related to team analysis, system design, test design and execution, team dynamics, and changes in timing and priorities that might affect any of these parts.

Actually, I don't restrict the analysis to "5" - it can be 2, 3, or 8 - as many as are needed to get a "good enough" picture or level of granularity (part of this is an understanding of how deep we need to dig).  Extending the above example, it might become:
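The idea of a whys chain of flexible depth can be sketched in code. This is purely illustrative - the function name, the fault and the answers below are hypothetical examples, not taken from any real analysis:

```python
# Illustrative sketch: recording a "whys" chain as raw data.
# The depth is however many answers we collect - not necessarily five.

def whys_chain(problem, answers):
    """Pair each successive 'why?' with its answer, starting from the problem."""
    chain = [("Problem", problem)]
    statement = problem
    for answer in answers:
        chain.append((f"Why? ({statement})", answer))
        statement = answer  # the answer becomes the next thing to question
    return chain

chain = whys_chain(
    "Fault reported against feature X",
    [
        "A test for the changed behaviour was never run",
        "The test wasn't in the regression suite",
        "Priorities changed mid-project and the suite wasn't updated",
    ],
)
for question, answer in chain:
    print(f"{question} -> {answer}")
```

The point of the sketch is that the chain stops when the picture is "good enough", not at a fixed count of five.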

Are Your Lights On?
This approach is heavily based on the Gause & Weinberg book, "Are your lights on?", which I recommend highly. It helps answer:

  • What is the problem?
  • To whom is it a problem?
  • Whose problem is it?
  • Does the problem need solving?

Usually, by working through these aspects of the problem it is possible to determine the extent of the issue, whether it is an issue that should be addressed, whether it is something that can be done now, and by whom.
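The four questions can be thought of as a checklist that the analysis gradually fills in. A minimal sketch, with a hypothetical example answer:

```python
# Illustrative sketch: the "lights on" questions as a simple checklist.
# The filled-in answer below is a hypothetical example.

questions = [
    "What is the problem?",
    "To whom is it a problem?",
    "Whose problem is it?",
    "Does the problem need solving?",
]

answers = dict.fromkeys(questions)  # all start unanswered (None)
answers["What is the problem?"] = "Customer used feature X before it was ready"

unanswered = [q for q, a in answers.items() if a is None]
print(f"{len(unanswered)} questions still open")
```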

Sometimes the problem appears due to a change in project priority or timing: something doesn't get completed and, crucially, the customer is not aware that they shouldn't use feature X in this first drop, or that it has certain limitations.

Communication (or the absence of it) is, occasionally, a root cause in itself!

A Third Way
I said I had two main ways; well, recently I've realised that I'm using a third - I say recently, as really I've only just attached (or discovered) the terminology for it: Framing.

Framing is heavily based around the work of Tversky & Kahneman, with some application by Russo.

This is the filter that is used to look at the problem. In the above example, I might look at the issue from the project perspective: what priorities existed on the project (stakeholder or customer)? Were there aspects outside the control or remit of the project? Did any of the project aspects change, and with what timing?

There might be team frames: how did the internal and external communication to and from the team work? Was there a project change that didn't filter into the team? What about information availability, assumptions and other limitations?

Finally, there might be aspects that work on project, team or individual level: risk averse or risk taking attitudes; attitudes of uncertainty, lack of support or overconfidence; or a combination of many factors.

So, applying some framing aspects to this example might give:

The framing of the problem - and using multiple frames - helps to put the issues into perspective, the situational context - what did it mean at this particular point in time, in this particular project, in this particular team set-up, for this particular customer, etc, etc.
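Using multiple frames over the same raw data can be sketched as a set of filters, each producing its own view. Everything below - the frame names, the findings, the keyword filters - is a hypothetical illustration:

```python
# Illustrative sketch: several frames applied to the same raw data,
# each producing a different view of the findings.

raw_data = [
    "regression test for feature X was never added",
    "mid-project priority change moved the team to feature Y",
]

# Each frame is just a filter that keeps what is relevant to that perspective.
frames = {
    "project": lambda data: [d for d in data if "priority" in d],
    "team":    lambda data: [d for d in data if "test" in d],
}

views = {name: frame(raw_data) for name, frame in frames.items()}
for name, view in views.items():
    print(f"{name} frame sees: {view}")
```

The same findings look different through different frames - which is the point: no single frame gives the whole situational context.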

Full Circle
The framing aspect now fits quite well into my RCA approach, or any problem analysis approach:

  • Raw data is gathered: 5-whys is one approach
  • Situational context: framing, ideally using a number of frames
  • Meaning and decision: lights-on
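The three stages above chain naturally into a pipeline. The sketch below is a hypothetical stand-in for the real activities - each stage function only hints at what the corresponding human analysis would produce:

```python
# Illustrative sketch of the three-stage analysis as a pipeline:
# raw data -> situational context -> meaning and decision.

def gather_raw_data(fault):
    # stand-in for a 5-whys chain or similar fact-finding
    return [f"why-1: analysis of {fault}", "why-2: missing test", "why-3: priority change"]

def apply_frames(data, frame_names):
    # stand-in for framing: one situational view per frame
    return {name: f"{name} view of {len(data)} findings" for name in frame_names}

def lights_on(framed_views):
    # stand-in for the meaning/decision step: is it a problem, and whose?
    return {"needs_solving": bool(framed_views), "owner": "undecided"}

decision = lights_on(
    apply_frames(gather_raw_data("fault in feature X"),
                 ["project", "team", "individual"])
)
print(decision)
```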

Mind maps help in a number of aspects here: visualisation and recording of thinking and exploration.

However, the big aspect is the thinking - using a range of tools to explore the problem, not limiting analysis to the problem artifact (a team didn't do xyz...) but adding in the situational aspects and weighing up the problem with the surrounding information to get a better understanding of it (framing). Then the information can be used to help understand whether it really is a problem that needs solving, and by whom (lights-on).


  1. Gause & Weinberg, "Are Your Lights On?: How to Figure Out What the Problem Really Is" (Dorset House, 1990)
  2. Tversky & Kahneman, "The Framing of Decisions and the Psychology of Choice" (Science, vol. 211, 1981)
  3. Russo & Schoemaker, "Decision Traps: Ten Barriers to Brilliant Decision-Making and How to Overcome Them" (Fireside, 1990)
  4. 5 Whys (Wikipedia)

Wednesday, 4 May 2011

Carnival of Testers #21

" #softwaretesting #testing "

"April april, din dumma sil
Jag kan lura dig vart jag vill!"
(Swedish saying when you've been "April fooled")

Well, no April fools jokes here - just proper writing and blogging about testing.

Testers Meeting
April saw the second SWET occurrence and the ninth LEWT. Many SWET attendees wrote their reflections. More from LEWT, please? Elsewhere, testers were at meetups, at conferences and testing online.
  • Markus Gärtner gave a comprehensive write-up of his Management 3.0 course.
  • A good reflection on some options for visualizing testing was posted by Gojko Adzic.
  • A thought experiment on factory vs context-driven testing was started by Rikard Edgren, here. Some interesting discussion in the comments too!
  • Writing tips for software testers were the subject of Anne-Marie Charrett's post, here.
  • Entaggle has had a few posts already. Here is the first (of many) from Elisabeth Hendrickson.
  • Changes to Joe Strazzere's "people in testing" page were posted, here.
  • John Stevenson looked at a relevant question about mentoring new testers, how he did it and some initial ideas, here.
  • Henrik Andersson, who hasn't blogged before (I think), made an interesting entry, here, with a piece on the need for diversity.
Until the next time ...