Tuesday, July 6, 2010

Learnings from the Interview Board

Read an interesting post from Pari: http://curioustester.blogspot.com/2010_07_01_archive.html
It reminded me of the many interview panels that I have been part of...
It is always so easy to be on the other side of the table... you can ask anything and get away with it... and that is the impression with which many an interviewee walks in. (BTW, I usually interview people with 8+ yrs of experience only.)

Just like Pari resorted to asking a potential employee (I prefer to refer to them that way until I officially reject them after an interview :)) if he could test a marker, I have used many similar simple examples: a coffee vending machine, swipe in/out machines, login on a home page, etc.

People are usually caught by surprise because they come prepared for heavy questions with equally weighty answers :)

Where does it help me?
  • To identify genuine testers and ensure that a mere checker is not hired
  • To gauge the person's grasp of the concepts
  • To identify the gaps in the thought process that he brings to the table as a heavyweight

A few easy areas (read: traps) where people get confused and are easily caught:
  • Metrics and Measures - What are they collecting and reason to have them ?
  • Software Testability and Usability
  • Test Scenario and Test cases
  • Test Oracle and Test Heuristic
  • Why automation? When and where to use it? What is the rule of thumb for deciding on automation?
  • What are they doing now in their current job profile and why are they doing so ? Why did they not use an 'XYZ' approach ?
  • Reliability of their test cases 
  •  ..........
[You thought I'll leak my complete questioning skill here ;) ...who knows when I'll interview you next !!]

Long and short of it: we seriously lack thinking testers in the community... there are also people who have achieved a lot but will still struggle when subjected to such questions.

The testing community needs a big change in its thought process, and we need to be that change.

Monday, July 5, 2010

Negative and Positive Test Cases

There was an interesting conversation between James Bach and Michael Bolton last week on Twitter, where the topic focused on negative and positive test cases.

One of Michael's comments, "When you read a newspaper, do you *count* the stories? Count good news stories vs. bad news stories?", made me think about why we as testers get swayed into classifying our test cases in such a manner...

Yes, when I read a newspaper I do not classify the stories, nor count them... so then what becomes so different when I write test cases?

Putting on my test manager's cap, these are the few immediate things / reasons that came up:
  1. How do you ensure that you have your test scenarios totally covered ?
  2. Can you maintain an exhaustive checklist ?
  3. Or is it easier to classify them as negative and positive so that the team builds an easier understanding of what is expected of them?
  4. Does it take me away from micromanagement?
In my opinion, the completeness of the test cases covering a scenario is bound by both the negative and positive aspects required to test it. So, based on my experience and assumptions, I feel it is the ease of tracking that brought the terms "negative" and "positive" into the scene. Somewhere along the way the thought process got lost, and it became more a process for tracking and keeping a count than for measuring the value added in totality.
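To make that point concrete, here is a minimal sketch (my own illustration, not from the Twitter exchange; the `authenticate` function is a hypothetical login check invented for this example). It shows how the same scenario splits into "positive" and "negative" cases, and why the labels help with counting rather than with coverage: drop either half and the login scenario is no longer fully tested.

```python
# Hypothetical login check, used only to illustrate the positive/negative split.
def authenticate(username, password, users):
    """Return True only for a known username with the matching password."""
    return users.get(username) == password

users = {"alice": "s3cret"}

# "Positive" case: valid credentials are accepted.
assert authenticate("alice", "s3cret", users) is True

# "Negative" cases: invalid inputs are rejected.
assert authenticate("alice", "wrong", users) is False   # wrong password
assert authenticate("bob", "s3cret", users) is False    # unknown user
assert authenticate("", "", users) is False             # empty inputs

print("all checks passed")
```

One positive case and three negative cases here is not a meaningful ratio to report; the scenario is only covered because both kinds are present together.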