Sunday, 21 September 2008
Are regression packs still worthwhile?
Recently I was on a forum and noticed a thread titled "Why do we do regression testing?". The poster was complaining about their company's regression pack and asking whether it was worth the effort of fixing it.
My answer was a little quick off the mark. I replied, "Definitely! It's the backbone of each and every build!" But then this got me thinking: how many testers out there DO NOT practise Agile development? How many testers DO NOT work in an environment where test automation is second nature?
I do not ask this question in a malicious way; I genuinely would like to know. When I go to SIGiST, a number of the questions in the presentations and workshops come from people who have not worked in an Agile environment and, to be honest, seem scared by it. These are the same people who get that deer-in-headlights look on their face when you mention something like "Continuous Integration".
This leads me to my next question: is there a "rich get richer, poor get poorer" divide happening in testing? People at conferences mention that the line between development and testing is blurring; it's called grey-box testing. But your average manual tester in a large non-software institution (banks, insurance companies, etc.) got their "break" in testing by being seconded to the test team for a project and never wanting to leave that secondment. They may not have been given opportunities to learn how to program, or may never even have heard of concepts like Agile, TDD and all the other new ways testers do their jobs.
To these people, the answer to my original question is still the same: "Yes! Definitely! Otherwise things that used to work may not work anymore. I would also suggest automating the regression pack as much as possible, so that you can concentrate on testing those hard-to-test areas and on making your testing more efficient. Quality of software is still the main objective of testing."
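To make "automate the regression pack" concrete, here is a minimal sketch in Python. All names are illustrative, not from the post: each regression case pins an input to an output recorded from a known-good release, so any behavioural change fails loudly on the next run.

```python
# Minimal automated regression pack sketch (illustrative names).
# Each case pins an input to a previously verified ("golden") output,
# so a change in behaviour shows up as a failing case.

def calculate_discount(order_total):
    """The system under test -- a stand-in business rule.
    Orders of 100 or more get a 10% discount (integer currency units)."""
    if order_total >= 100:
        return order_total // 10
    return 0

# (input, expected output) pairs recorded from a known-good release.
REGRESSION_CASES = [
    (50, 0),
    (100, 10),
    (250, 25),
]

def run_regression_pack():
    """Run every case and return the list of failures (empty = all pass)."""
    failures = []
    for order_total, expected in REGRESSION_CASES:
        actual = calculate_discount(order_total)
        if actual != expected:
            failures.append((order_total, expected, actual))
    return failures

if __name__ == "__main__":
    failures = run_regression_pack()
    print("PASS" if not failures else "FAIL: %r" % failures)
```

The point of the sketch is the shape, not the business rule: the pack runs unattended on every build, and a human only looks at it when it fails.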
Labels: agile development, manual testing, regression, training
3 comments:
Hi,
While I do agree with your "yes, definitely", I think it's a guarded yes. For instance, we have a large suite of several thousand automated regression tests that we've run against every release of our products for the last 15-20 years(!). Sad as I am, I know what each and every test does, and I have a good idea which ones regularly fail; a number of them have NEVER failed.
Is it still worthwhile running tests that simply never fail? Part of me says yes, it is: I'm sure the second we remove them we'll introduce a bug that those tests would have picked up. The other part of me knows that we are under pressure to get our automated regression runs through quicker and quicker, despite adding new tests all the time.
The only sane and rational reaction is to make sure you are running the most effective tests and removing older ones that are no longer relevant. In other words, the regression suite should be constantly reviewed to keep it as effective as possible.
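One way to make that kind of review systematic (a sketch under assumed data, not the commenter's actual process; every name here is illustrative) is to keep a failure history per test and flag the ones that have never failed, or have not failed in a long time, as candidates for human review rather than silent deletion:

```python
from datetime import date

# Illustrative failure history: test name -> dates on which it failed.
failure_history = {
    "test_login": [date(2008, 3, 1), date(2008, 8, 12)],
    "test_export_csv": [],                  # has never failed
    "test_rounding": [date(2005, 6, 30)],   # last failed years ago
}

def review_candidates(history, today, stale_after_days=365):
    """Flag tests that never failed, or whose last failure is older than
    stale_after_days, for review -- a human decides whether to retire them."""
    candidates = []
    for name, failures in history.items():
        if not failures:
            candidates.append((name, "never failed"))
        elif (today - max(failures)).days > stale_after_days:
            candidates.append((name, "stale"))
    return sorted(candidates)

print(review_candidates(failure_history, date(2008, 9, 21)))
# -> [('test_export_csv', 'never failed'), ('test_rounding', 'stale')]
```

The design choice matters: the script only nominates tests for review; it does not delete them, precisely because a never-failing test may still be the one that catches the next regression.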
As for your other comment, I'm definitely in the "rich get richer" segment, working in an agile environment and developing tests, product, docs, etc. Working on these other features also (I believe) makes you a better tester, in that you begin to see the end product from a different perspective and can therefore come up with new ideas for what to test to get the most effective coverage. I can't imagine going back to the horrible "old way" of running manual test scripts all day every day; it always stuns me that companies still think this is a worthwhile use of time and resources!
Hi David,
With regard to your question, "How many testers out there DO NOT practice Agile development?"
I don't have an exact answer in terms of the number of testers doing Agile. But here's some related information about the numbers of organizations doing Agile.
Dr. Dobb's did surveys of Agile usage in 2006, 2007, and 2008. Here's a link to their latest survey results:
http://www.ddj.com/architect/207600615
Long story short, 69% of organizations are using Agile in one or more projects and, obviously, 31% aren't.
Note: of the 69%, not necessarily ALL projects are Agile -- they could have 99 waterfall projects and only one Agile project, and still be included in the "yes we use Agile" category.
These surveys were done by a guy named Scott Ambler. Here are links to pages with his questions, raw data, and presentation of results:
http://www.ambysoft.com/surveys/practicesPrinciples2008.html
http://www.ambysoft.com/surveys/agileFebruary2008.html
http://www.ambysoft.com/surveys/agileMarch2007.html
http://www.ambysoft.com/surveys/agileMarch2006.html
Hope this helps!
I would have to agree with you and Nick that it is tempting to reply with an immediate yes; however, there are probably some caveats to put around this:
- If the effort of maintaining the regression pack is higher than that of gaining the same level of confidence through targeted manual testing then it may be more effective to rely on manual/exploratory testing. Humans are much more versatile in adapting to changes in the software than a badly written and brittle automated process.
- The test pack should be reviewed and refactored just as you would hope the SUT would be
- You need to instil the right mentality in the organisation about the importance of the tests. When I first implemented the regression harness in the system I am currently working on, the tests broke nearly every night due to changes such as cosmetic formatting of the results output. 18 months on, we have created a culture where, if a developer is checking in changes that will affect the nightly build, he will speak to the test team beforehand and even arrange the most convenient time to make the check-in, to minimise the impact on the regression testing.
- Maintaining a manual regression pack can be as much effort as an automated one. My earliest attempts at manual regression packs involved fully prescribing every click and field entry to be performed in the test, with exact results. I soon learned that documenting the intention of a test allows far more flexibility, whilst still leaving room for some exploration by the tester in the execution of that intention.
- Just because people say they are Agile does not mean they have automated regression - I know of examples where people are attempting agile but then running all of their regression manually!
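The nightly-breakage problem described above (tests failing on purely cosmetic changes to the results output) can often be softened by normalising output before comparing it against the expected baseline. A sketch in Python, with illustrative normalisation rules that a real suite would tailor to its own output format:

```python
import re

def normalise(output):
    """Strip cosmetic differences before comparing against a baseline:
    collapse runs of whitespace, drop blank lines, trim line edges.
    (Illustrative rules only -- tune these to your actual output.)"""
    lines = []
    for line in output.splitlines():
        line = re.sub(r"\s+", " ", line).strip()
        if line:
            lines.append(line)
    return "\n".join(lines)

# A cosmetic reformat of the output (extra spaces, a blank line)
# should not fail the regression comparison.
baseline = "Total:   3 items\nDone"
new_run = "Total: 3 items  \n\nDone"

print(normalise(baseline) == normalise(new_run))
```

The raw strings differ, but the normalised forms agree, so the nightly run only fails on changes that actually alter the meaning of the output.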
I cannot imagine now testing software in an environment where it was not possible to implement and maintain an extensive automated regression test suite.