Wednesday, 26 March 2008
What makes a good automated test?
I have been asked to try to automate the testing of a wider range of applications for the group. This is going to be quite challenging, since I have not automated desktop applications for about 15 months and the software we make is quite unique. I will also, hopefully, be teaching the other testers how to maintain the automated tests and eventually create their own.
This challenge has made me think back on my automated testing experience and ask, "What makes a good automated test?". A good automated test has all the hallmarks of a good manual test, except that it's comparatively quick and cheap to run. This means that an automated test is successful when it finds bugs and is repeatable wherever the software is installed.
In terms of web testing, it would be a test script that can be created, stored in a central place, and run from that central place. Ideally it would also be run on the continuous integration server when doing builds.
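As a rough sketch of the kind of script I mean, here is a minimal check written with the Selenium WebDriver bindings for Python (a later API than the Selenium I was using at the time). The URL, page title and element names are invented purely for illustration; the point is that it is plain code kept in source control, so the same test a tester runs locally can be run unchanged by the build server.

# A minimal smoke test that lives in version control alongside the product,
# so any machine that checks out the suite (including the CI server) can run it.
import unittest

from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginPageSmokeTest(unittest.TestCase):

    def setUp(self):
        # Headless Firefox, so the same script runs on a tester's desktop
        # or on a continuous integration build agent with no display.
        options = webdriver.FirefoxOptions()
        options.add_argument("-headless")
        self.driver = webdriver.Firefox(options=options)

    def tearDown(self):
        self.driver.quit()

    def test_login_form_is_present(self):
        # Hypothetical application URL and element name.
        self.driver.get("http://testserver.example/login")
        self.assertIn("Login", self.driver.title)
        self.assertTrue(self.driver.find_element(By.NAME, "username").is_displayed())


if __name__ == "__main__":
    unittest.main()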
The thing that makes a good automated test great is scalability. I recently watched a YouTube video of the Selenium Users Meet-up. There were Google staff there saying that their testing farm handles 51,000 tests, and most Fridays all of these tests run. They then went on to complain that Selenium struggled to scale to that size. To be honest, at that size I am not surprised!
P.S. To those who are waiting for the next Selenium tutorial: I am hoping to have it complete in the next week for everyone to use.
Friday, 7 March 2008
What would people like for the next Selenium Tutorial?
Thanks to everyone who has been using my site and to all those who are subscribed to my RSS feed. I haven't released another tutorial for a couple of weeks and have started thinking about what people would like next for a Selenium tutorial.
I am struggling to think of what people would really like. I would appreciate it if people would take the time to let me know, via the comments section of this blog, which tutorial they would like next. Please don't think that your questions are too novice; I would like to help out as many people as possible!
I look forward to your replies!
Friday, 15 February 2008
Is documenting test processes a bad idea in an Agile Development Environment?
As some of you know, I work in a smallish development team in Southampton. There are other development teams, of different sizes, in the United Kingdom and in France that all fall under the same group of companies.
The unfortunate problem that we have is that there isn't a properly documented process for getting everything in place and ready for each testing cycle. Once we have everything ready we have another problem that I will hopefully discuss in a future posting.
This is where I had the thought, "Is documenting test processes a bad idea in an Agile Development Environment?". If you have worked in an office with ISO 9001 accreditation, you will know that everything needs to be documented; if you have worked in a Six Sigma office, you will have noticed that you have to record everything you do in a database so that the "Black Belts" or "Master Black Belts" can see areas that need improvement.
So back to my question, "Is documenting test processes a bad idea in an Agile Development Environment?". My answer is definitely not! And I will admit this is a fairly biased answer, having started my career in large financial institutions where I was taught that everyone is expendable but their process knowledge isn't, which means that everything needs to be documented. However, my view is that things need to be documented and followed only up to a point, as long as that point does not impact on the creativity of the software developers and the testers.
It's this creativity that allows people to come up with new and exciting ways of developing and testing. Tools like the xUnit frameworks, Selenium and many more have come from this creativity. It then needs to be balanced with some form of Total Quality Management (TQM) so that everything can be reproduced if and when needed.
Documented processes can't be completely wrong; if they were, then large companies like IBM, Citrix and Microsoft would never have got to where they are now!
Thursday, 14 February 2008
What has changed in the last year at work
Not that long ago it was my one-year anniversary with my employer. As with most people, it just seemed to blow over, but I have had a chance to reflect on what has changed in my time there. The one thing that I can definitely say is that it has been a good year.
There was, however, a team of developers eager to make sure that they delivered a good quality product. One of the first changes was introducing a way for developers to unit test their front end code.
I have managed to get the development team to start using NUnit to create their unit tests, instead of developing their own applications to unit test their work. This has meant that the number of bugs that get into the builds has dropped dramatically.
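NUnit examples would be in C#, so purely as a sketch of the xUnit pattern the team has moved to, here is the same idea using Python's unittest. The Basket class is a made-up stand-in for one of our own components; the shape of the test is what matters.

import unittest


class Basket:
    """Hypothetical production class, included only to make the example runnable."""

    def __init__(self):
        self._items = []

    def add(self, name, price):
        self._items.append((name, price))

    def total(self):
        return sum(price for _, price in self._items)


class BasketTests(unittest.TestCase):
    # Small, focused tests like these replace the home-grown test applications.

    def test_empty_basket_totals_zero(self):
        self.assertEqual(Basket().total(), 0)

    def test_total_is_sum_of_item_prices(self):
        basket = Basket()
        basket.add("book", 10)
        basket.add("pen", 2)
        self.assertEqual(basket.total(), 12)


if __name__ == "__main__":
    unittest.main()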
These two changes have created a mind shift for all of the developers towards a more agile development cycle. This shift has not been that difficult, and the reason for this is the management team that I report to. They are always open to ideas, especially ones that save money, and agile development on web projects definitely has that appeal.
So the next thing in my plan is to create an integration server so that all the different development teams across the UK and France can check in their work and be confident that their changes will work in another environment.
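To make the idea concrete, here is a rough sketch of the kind of build step such a server would run after every check-in: run the unit tests, then the web tests, and fail the build if either suite fails. The commands and paths are assumptions for illustration, not our real build configuration.

import subprocess
import sys

# Each step is a command the build agent runs in order; any non-zero exit
# code fails the whole build. Both commands here are hypothetical examples.
STEPS = [
    ["nunit-console", "Build/UnitTests.dll"],
    ["python", "-m", "unittest", "discover", "tests/web"],
]


def main():
    for step in STEPS:
        print("Running:", " ".join(step))
        result = subprocess.run(step)
        if result.returncode != 0:
            print("Build failed at:", " ".join(step))
            return result.returncode
    print("Build passed")
    return 0


if __name__ == "__main__":
    sys.exit(main())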
Wednesday, 13 February 2008
Graphical Test Planning
In December I went to a conference in London, and one of the lectures I went to listen to was about Graphical Test Planning, by Hardeep Sharma of Citrix Systems.
Graphical Test Planning is a way of creating a test plan without having to write entire documents. I am sure that most of the readers of this blog practise fully agile development and are probably thinking that I am mad. Maybe I am mad, but I really think that this is a great way to work out what needs to be tested and a good way to track the progress of testing.
The entry below is more my thoughts from having tried this technique on web projects, and I have used web-based examples. I hope you enjoy reading it!
Graphical Test Planning is done by creating structured relationship diagrams (SRDs). These allow product managers to know what is going to be tested from the day that they decide there is going to be a new version of the product.
What a Graphical Test Plan is or isn't
What a Graphical Test Plan is:
- A structured relationship diagram
- A list of behavioural areas that need to be tested
- A method of getting great feedback about what needs to be tested
- A method of tracking progress of design and execution of testing
What a Graphical Test Plan isn't:
- A flow diagram
- A test manager's thoughts or mind map
- A feature list
- A hand-holding exercise for a test engineer on how to use the application
How To Create A Structured Relationship Diagram
A structured relationship diagram shows which behavioural areas need to be tested on different platforms and with different compilers. Below is an example of what an SRD looks like for Selenium. It's not 100% correct for Selenium, but I wanted to show something with no coverage.
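The original diagram does not reproduce well here, so below is a rough textual stand-in for the kind of breakdown it showed. The behavioural areas, platforms and coverage are my own illustrative guesses, not the actual SRD from the post.

# A textual stand-in for the SRD: each branch is a behavioural area and the
# leaves record where coverage exists. The contents are illustrative guesses.
srd = {
    "Selenium Core": {
        "Record and playback": ["Firefox", "Internet Explorer"],
        "Locator strategies": ["Firefox", "Internet Explorer", "Opera"],
        "JavaScript execution": ["Firefox"],
    },
    "Selenium Remote Control": {
        "Browser launching": ["Firefox", "Internet Explorer"],
        "Client drivers": [],  # an area deliberately left with no coverage
    },
}

# Walking the structure gives the same view the diagram gives at a glance:
# which behavioural areas still have no coverage at all.
for product, areas in srd.items():
    for area, platforms in areas.items():
        if not platforms:
            print("No coverage yet:", product, "->", area)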
From this you can see which areas need to be tested and which don't. If we carry this theme on, we can start drilling the behavioural areas down into more detailed SRDs, and if we carry on drilling down we will start to create another form of these diagrams, called a Test Case Diagram.
A Test Case Diagram looks a lot like an SRD, except that at the end of the diagram, instead of showing behavioural areas, it shows the expected results.
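Carrying on the same made-up example, a branch of a Test Case Diagram might bottom out in expected results rather than further behavioural areas, something like this:

# A sketch of one Test Case Diagram branch: the leaves are expected results.
# Again, the contents are illustrative, not the diagrams from the talk.
test_case_diagram = {
    "Locator strategies": {
        "Locate an element by id": "The element with the matching id is returned",
        "Locate an element by XPath": "The first element matching the expression is returned",
        "Locate a missing element": "A clear 'element not found' error is reported",
    },
}

for area, cases in test_case_diagram.items():
    for case, expected in cases.items():
        print(area, "|", case, "=>", expected)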
The great thing about this is that you can create the SRD in about 10 minutes and then do the Test Case Diagrams in a few hours, which means that you can have all of this test information ready before any code is written.
Compared with writing a full test plan document, you will have saved yourself somewhere in the region of one man-month, and you are likely to have fixed a number of flaws in the design before any code is near completion.
Thanks to Hardeep Sharma for showing off this technique at the SIGIST Conference in December.