Wednesday, August 8, 2012

How to deliver a quality product


The mind map I'm posting was created by looking back on different projects and consulting different colleagues, including my wife, the best Business Analyst I've ever seen. It is most probably not complete, but I think it is ready to share.

The idea behind this mind map is to focus on the key aspects required to deliver a high-quality software product of considerable size, and on how testers can contribute to a higher level of quality in the delivered product.





If a tester wants to contribute to quality, they can do more than testing a product once it is delivered. Testers have many other duties during software development. They know the product's and project's inputs and outputs and are aware of their risks. They understand the changing business needs and their explicit and implicit requirements. They challenge the architecture and analysis, and coach developers and acceptants to check their delivered product effectively against standards. They facilitate environment and data preparation. A tester understands and contributes to the release and deployment mechanisms.


Planning & Control

It is the tester's job to:
  • Take part in the project planning and approach; challenge the planning
  • Look into the processes and challenge them with respect to quality delivery
  • Find dependencies and flag them up to project/programme management
  • Understand the test requirements and build a log of them
  • Advise on test tool selection based on the budget, timeline, experience and test requirements.

Design

It is the tester's job to:
  • Understand the product needs and rationale behind the project
  • Test the designs and architectural models and point to their weaknesses
  • Find and report flaws in communication patterns and stakeholder involvement
  • Coach acceptants in the definition of E2E prototyping test cases; lead them and help them in the E2E test preparation.

Delivery

It is the tester's job to:
  • Test, test, test, and automate tests (a minimal sketch of such an automated check follows this list)
  • Coach developers in checking against standards
  • Contribute to and cooperate closely with deployment and release management
  • Coordinate proper and clean bug reporting and the bug life cycle 
  • Provide and manage the required test infrastructure.
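
As a minimal sketch of what an automated check against an agreed standard can look like in practice (the order_total function, the 21% VAT rate and the rounding rule below are hypothetical assumptions, not taken from any real project):

import pytest

def order_total(prices, vat_rate=0.21):
    # Toy stand-in for real product code: sum the line prices, add VAT,
    # and round to two decimals as the (assumed) standard prescribes.
    return round(sum(prices) * (1 + vat_rate), 2)

@pytest.mark.parametrize("prices, expected", [
    ([], 0.0),              # empty order
    ([10.00], 12.10),       # single line with 21% VAT
    ([1.99, 2.49], 5.42),   # rounding to two decimals
])
def test_order_total_matches_agreed_standard(prices, expected):
    # The expected values encode the agreed standard; any deviation fails.
    assert order_total(prices) == expected

Once checks like this run automatically on every build, the tester's time is freed for the testing that cannot be scripted.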

Acceptance

It is the tester's job to:
  • Coach the acceptants in their verification and testing process
  • Analyze and report test results in simple and understandable language
  • Provide release advice
  • Coordinate E2E test execution.

Communication

A tester is a great communicator. They understand the business needs, reason with analysts, challenge architects and managers, and coach users and developers.

    Thursday, August 2, 2012

    Why TMap should be called CMap


    As I've said before, I was raised on TMap. And since I arrived in what one could call 'test puberty', I have started to make up my own mind about testing and to find my own way. My adventure led me to a great blog post that made me think about testing and checking (http://www.ministryoftesting.com/2012/07/mindmaptesting-and-checking/), which led me to Michael Bolton's blog (http://www.developsense.com/blog/2009/08/testing-vs-checking/) and made me realize that TMap (Test Management Approach) should actually be called CMap (Check Management Approach).
    If the above two links are new to you, I would recommend reading them.
    I don't have the time or the will to get into every detail of TMap to make my point, as this "bible" runs to over 700 pages of valuable information about testing. Therefore, I will touch on the most important parts of the book and explain the changes I would propose for the next version(s).
    Let’s start with the definition of testing.
    TMap®: Testing is a process that provides insight into, and advice on, quality and the related risks
    I would like to split it into two new definitions:
    1. Checking is a process that possibly provides insight into, and advice on, quality and the related risks.
    2. Testing is an investigation that provides insight into, and advice on, quality and the related risks.
    Why is testing not a process?
    A process is a set of interrelated tasks that, together, transform inputs into outputs (http://en.wikipedia.org/wiki/Process_(engineering))
    So we should have:
    - Input: Test base, test cases, expected results
    - Output: Defects, test report
    In reality, testing cannot be a process. Why not? Because we can't plan where we will find the bugs, so we can't plan where we will need to look and for how long. Was the analysis/development consistent with the company's history? Is it compliant with the business process? What were the real user expectations? To answer such questions, we have to plan an investigation and re-plan based upon our findings.
    Checking can be mapped out in a process and can be easily planned. What do we check? When do we check? How much re-checking do we estimate? Which bugs did we find, checking which requirements? We can check whether requirements are implemented correctly. We can check whether field validations show the analysed error messages, and we can verify the load and stress requirements...
    Check specification (scripting of checks) can only be done for the parts that we can check against expectations or a predefined standard. Testing can go beyond those boundaries. What else can the system do that was not specified? Do we want the application to be able to do this? Is it important to test this?
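
    As a minimal sketch of what such a scripted check could look like in Python (the register_user function, the input data and the expected error messages are hypothetical assumptions, not taken from any real specification):

# Scripted checks derived from a predefined standard: each input is tied
# to the exact error message the (assumed) specification prescribes.
EXPECTED_ERRORS = {
    "empty_email":    "E-mail address is required.",
    "invalid_email":  "Please enter a valid e-mail address.",
    "short_password": "Password must be at least 8 characters.",
}

ATTEMPTS = {
    "empty_email":    {"email": "",          "password": "s3cretpwd"},
    "invalid_email":  {"email": "not-an-@",  "password": "s3cretpwd"},
    "short_password": {"email": "a@b.com",   "password": "short"},
}

def run_checks(register_user):
    # register_user is assumed to return the validation message it shows.
    results = {}
    for name, data in ATTEMPTS.items():
        shown = register_user(**data)
        results[name] = (shown == EXPECTED_ERRORS[name])
    return results

    Everything outside those scripted expectations - what the form does with a 300-character address, for instance - is territory for testing rather than checking.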

    What is testing?
    When we elaborate further on testing according to TMap, every definition of testing compares a product with a predefined standard.
    Of course this is the ultimate way to deliver a set of beautiful metrics later on. But what about those expectations that were not predefined? What about those orphan features, those implicit requirements or poorly communicated company standards… or those gaps that nobody wanted to add to the predefined standard? Are those out of scope for testing because they were not defined?
    For checking, I would agree. Checking is a very important task that great developers do, and it is of high importance during the acceptance phase of product delivery.

    The 4 Essentials
    1. Business driven test management
    This area of TMap offers a great method and toolset to prepare for checking and testing with a correct business focus and to perform a proper product risk analysis. From the moment the risks are defined in the BDTM life cycle, testing and checking, in my opinion, part ways.
    Checking continues perfectly in accordance with the TMap methodology: at a certain moment in time, check goals are consolidated, product risks are defined, test and check thoroughness is determined, and check scripts are written and executed.
    Testing, however, will not consolidate. Risk areas might be reduced or increased, new goals might be introduced and other goals will disappear. Testing will be focused on those risks and goals within a changing environment. There will be more cycles passing through business, and there will be less script preparation, but a lot of testing will take place and will get documented.
    I would propose to separate BDTM from TMap, as this toolset and method is a very valuable one for testing as well as for checking. I can only recommend it to anyone who hasn't used it before.

    2. The structured test process
    See definition of testing. Testing is not a process, it is an investigation.
    Checking, however, can follow a process, and the one described in TMap offers a great way to do so. My proposal is to change 'testing' into 'checking' here, for the sake of terminology and of testing.

    I have to add that there is one lovely part in the TMap test process. Specifying the test infrastructure, and the checklists provided to do so, helped me quite a lot to specify, and not forget, all the artefacts I had to order to enable testing.

    3. Complete toolbox
    For starters, a toolbox in my opinion is never complete. Only a sales department would try to convince me that they can offer me a complete toolbox... not an expert using tools. To be fair, TMap doesn't repeat the claim further in the book. It does mention that a large set of tools is present, which cannot be denied.
    I would still request the addition of, for example, the tea test and the shoe test. I tend to use those tools on every occasion I get to test software and, sadly enough, they remain very rewarding.

    TMap offers a great bunch of checklists and templates on http://www.tmap.net/
    I have seen and used almost all of the TMap test tools, but I always seem to come to the same conclusion: they are check tools, or are used as check tools, when they really could be test tools.
    For example: the well-known BVA (Boundary Value Analysis).
    BVA is a great test tool, but if you are looking for boundaries only in the predefined standard, you turn your test tool into a check tool. How do you know for sure that you know the real boundaries, or that you know all the boundaries? Checking applies BVA to the information provided in the standard. Testing looks for the real boundaries in the system.
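
    To make the difference concrete, here is a small hypothetical sketch in Python; the accepts_quantity function and the specified range of 1 to 100 are assumptions for illustration only.

SPEC_MIN, SPEC_MAX = 1, 100   # the boundaries written down in the standard

def check_specified_boundaries(accepts_quantity):
    # Checking: exercise only the boundary values derived from the standard.
    for qty, expected in [(SPEC_MIN - 1, False), (SPEC_MIN, True),
                          (SPEC_MAX, True), (SPEC_MAX + 1, False)]:
        assert accepts_quantity(qty) == expected, f"check failed at {qty}"

def probe_for_real_boundaries(accepts_quantity):
    # Testing: look for boundaries the standard never mentioned.
    suspects = [-1, 10**9, 2**31, None, 0.5, "10"]
    for qty in suspects:
        try:
            print(qty, "->", accepts_quantity(qty))
        except Exception as exc:
            # A surprise here is a test result, not a script failure.
            print(qty, "-> raised", type(exc).__name__)

    The first function turns BVA into a check; the second keeps it a test, because its outcome depends on what the investigation actually finds.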

    4. Adaptive method

    I have to admit that TMap is an adaptive method. You don't have to use any of it if you don't want to, and you can pick and choose the useful things that are relevant for your context.
    I propose to use the following great features TMap offers:
    - BDTM (taking into account that the testing scope changes)
    - All provided checklists and techniques
    - The test infrastructure part of the test process

     

    Conclusion

    TMap is a great method for checking and offers a bunch of useful checklists and templates. Those are downloadable from http://www.Tmap.net.
    BDTM is a good check management method and offers a good base for a test management method.
    I would therefore extract BDTM out of TMap and re-brand TMap to CMap.
    I would refer to, for example, http://www.ministryoftesting.com or http://www.developsense.com to find out what you're missing if you only check and don't test.

    Thursday, July 26, 2012

    Why testers know best...

    Because testers ask questions when they don't understand
    Because testers think about what can go wrong and not how it should work
    Because testers are not dedicated to going live on time
    Because testers still want to go live on time more than they want to find bugs that don't matter
    Because testers love finding bugs that matter
    Because testers don't gain much from saying 'I told you so'
    Because in the end, if a problem is not fixed in time, the tester suffers from it, so they are driven to find issues and defects early
    Because testers don't have 'babies' to protect and are therefore the least biased people in the project
    Because testers have a critical technical eye for detail and understand what business wants at the same time


    Tuesday, July 24, 2012

    Forming Storming Norming Performing


    We kicked off with a team of completely diverse people, starting a project that was going to be the biggest challenge most of us had ever faced.

    The first months of the project were more difficult than on any other project I had worked on. Finding our places in the project, how we would fit into the puzzle, was one of the biggest challenges for everyone. At first conflicts were avoided and specific tasks were not performed. Later on, some of us got caught up in quarrels and fights, and after months of hard work we, as a team, contributed to a late and poor-quality delivery.


    After a couple of months of suffering, we learned at a team-building event that this is completely normal. After that, we found that with good leadership, team members are carried through the different phases of a team-building model.
    The model I'm talking about is Tuckman's model, which describes the Forming, Storming, Norming and Performing phases of team building.

    If you ever get a chance to look at your own team from a distance, you'll definitely recognize this model, and if you're the manager or leader of a newly created team, you definitely need to know these things if you want to improve your team's atmosphere and quality of delivery.

    Forming

    The team is assembled. Team members don't know each other well and tend to act independently. There is no real group feeling. Conflict is avoided. Team members need guidance.

    Storming

    Team members start to take their positions in the team. This process often leads to tension between team members and to contradictory ideas.

    Norming

    Team members are aware of the need for rules and methods. Those rules and methods are defined within the team. The common objectives become clear and the required roles are divided among the team members. The team, with all its members, takes responsibility for the delivery of the objectives. The risk during the Norming phase is that the team loses either its creative spirit or the drive that brought it to the Norming phase.

    Performing

    The team is working together harmoniously to achieve the common objectives. The team works independently and makes collaborative decisions. Performing teams are characterized by high levels of independence, motivation, knowledge and competence.



    Unfortunately, many teams never make it past the Storming phase. However, I dare say that we made it at least to the Norming phase as a team, yet we still have a lot to deliver...

    Monday, July 16, 2012

    Bug Fix Bingo

    How do you turn the frustration of testers into positive energy?


    Bug Fix Bingo
    Developers seem to return with interesting reasons why a bug is not 'really' a bug.
    Testers, on the other hand, need to understand that developers don't produce code but create artwork.

    Now, tester, when you come back from a soft-skill battle with the developer, where you finally, without hurting their feelings, were able to make them aware of this little mistake they made, you can have a little prize as a reward for your empathy and communication skills.
    Instead of getting frustrated and going back to your PC with a headache, you get to participate in Bug Fix Bingo.


    [Image: Bug Fix Bingo screenshot]

    To the developers: please don't take it too personally. Bug Fix Bingo keeps your testers sane and, in the end, those bugs really need fixing.
    The game was created by K. J. Ross & Associates (http://www.kjross.com.au/page/Resources/Testing_Games/).


    Saturday, July 7, 2012

    Let's get rid of the safety net


     I couldn't possibly say it better than Gojko did... but I can write about it.

    What Gojko came to tell us at the Eurostar conference of 2011 was that software testing as we most often see it - a phase of executing test cases after the product development phase has ended - does not contribute to quality.
    It contributes to building a safety net for developers, managers, and even testers.


    Since testing is done after development, developers are encouraged to become lazy, as they don't need to make sure that what they deliver really works.

    Testing should not be about
    • Logging defects that developers actually already know about. When, for example, simple client-side field validations don't work, developers could know before a tester does (see the sketch after this list)
    • Waiting for a product before starting to test, and designing tests against a formalized analysis to cover our asses instead of the product
    • Answering business on their requests instead of providing what they really need
    • Assuming that all requirements and designs are clear and correctly defined
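
    As a minimal sketch of such a developer-side check (the validate_email helper and its rules are hypothetical, purely for illustration):

import re
import unittest

def validate_email(value):
    # Toy stand-in for the real client-side validation logic.
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value or ""))

class EmailValidationTest(unittest.TestCase):
    def test_accepts_plain_address(self):
        self.assertTrue(validate_email("jane.doe@example.com"))

    def test_rejects_obviously_broken_input(self):
        for bad in ["", "no-at-sign", "two@@example.com", "trailing@dot."]:
            self.assertFalse(validate_email(bad), bad)

if __name__ == "__main__":
    unittest.main()

    With a check like this running in the developer's own build, the tester never needs to log that class of defect at all.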

    Testing should be about
    • Helping analysts and developers by showing them how things can go wrong during design, development and delivery
    • Providing what business needs instead of what business wants
    • Helping non-testers be better at testing
    • Testing the complex and critical parts of a product
    • Finding the real requirements and sharing them with all team members
    • Being part of a team that is jointly responsible for the quality of the delivered product



     
    [Embedded presentation from gojkoadzic]



    Thursday, July 5, 2012

    Beware of inattentional blindness while testing


    What's "Inattentional blindness"? and what does it have to do with testing?
    I was advised about inattentional blindness by a colleague tester, who got it from another colleague tester, who got it from another colleague...

    This turns "Inattentional blindness" into a Test-Myth!

    Before I say anything further about it, I would like you to watch the very short video below and carefully follow its instructions.

    (Don't read below the movie yet, as it will contain spoilers)




    Were you surprised? There are different explanations for why this happens (http://en.wikipedia.org/wiki/Inattentional_blindness):

      • Conspicuity - The ball draws your attention.
      • Mental workload - You need to count, so you focus on counting only.
      • Expectation - You expect to see only passes, so you focus on the passes and block out other content.
      • Capacity - You are seeing this movie for the first time and you're not used to exercises like this.

    Now, what is in it for a tester? The key message is:

    When you focus, DON'T FORGET TO DE-FOCUS!


    This is what you do when you execute pre-scripted test cases:
      • The test needs your attention - if you pass it, you had better be sure about it! - Conspicuity
      • You need to focus on preparing the correct data and executing the correct steps - Mental workload
      • You expect only a defect against what you are testing - Expectation
      • Especially during the first run, you're not yet used to going through the procedural steps - Capacity

    It's important to achieve a goal, to pay attention and to be focused on what you're doing.
    It is equally important to look around at the things you did not prepare test scripts for and find the bugs.


    How can we do this?
      • If you feel the urge to prepare test scripts, don't detail them into descriptive test steps. Leave yourself some space to find alternative paths. Describe your paths at the moment of execution.
      • Per functionality, give yourself a time-boxed moment to look around in the application and find out what you forgot about.
      • Put a post-it on your screen saying "de-focus", because you will forget the moment you try to remember.