"Engineering teams have traditionally been split between program managers, developers and testers. Yet with new cloud methods of building software, it often makes sense to have the developers test and fix bugs instead of a separate team of testers, Nadella said in an interview last week after unveiling his memo."
"Some of the cuts will be among software testers,"
In Australia, we still call people who perform software testing "testers". As we know from the book "How Google Tests Software", at Google they are called "Software Engineers in Test" (SETs). When I was in America last year, I saw many "SET" job ads at the STARWEST conference.
Today, following a link from a LinkedIn update email, I was quite shocked by how quick and big the change is: the number of SET jobs is about ten times the number of software-tester jobs.
I have talked and written a lot about test automation and continuous integration, which are key reasons behind the success of big names such as Facebook and LinkedIn. (Facebook pushes two releases per day; LinkedIn joined this elite club after "luring" Kevin Scott, a senior vice president of engineering and longtime Google veteran, who "completely overhauled how LinkedIn develops and ships new updates"; Wired magazine called this a 'software revolution'.) While few of my audience or readers deny the importance and benefits of automated testing and CI, some of them, I feel, thought these could only be done by software giants. Many failed (including some wise testers who purchased TestWise) due to a lack of motivation.
As a matter of fact, test automation and CI are just as vital to small projects as to big ones, if not more. Automated tests and CI compensate for the limited testing and deployment resources often found in small teams. The very few companies who have done this well (such as Google and Facebook) rarely share how they apply test automation and CI, as they usually regard it as a secret. To be fair, while the technologies are open (such as Selenium and Jenkins), the application of test automation and CI varies somewhat with the nature and culture of the business.
I share our secret here: how we applied automated testing and CI to one of our new products, ClinicWise, an online clinic management application. It started as a side project about a year ago. A relative of mine, a dentist, was about to open a clinic and was looking for dental clinic software. He had used many before, but was not happy with any of them (too many bugs, too complex and very expensive). For some reason, I offered to develop a customized application for him. Looking back now, it sounds a bit crazy, as I had no background in health care and the client was 8000 km away.
Here are 'the secrets':
Get a skeleton application up and running with database migration, unit tests, code coverage, automated UI tests and deployment. It is not that hard, even if you haven't done it before. The key is to keep the skeleton application simple.
Set up a BuildWise server on the integration machine, running all CI steps within BuildWise. This requires an understanding of CI and a build language. My book Practical Web Test Automation contains instructions and sample build scripts.
Implement common components such as user sign-in, password reset, user management, access control, CRUD, pagination, file uploading and audit logging. No matter which language you use, there are mature libraries for these. Make sure to add some automated UI tests as you go, and keep them passing on your build server. Discipline!
Add exception reporting. When an error occurs, the system sends you an email with the full stack trace.
Enhance the UI. Bootstrap and Font Awesome make it very easy to give your application a professional look.
Talk to the client to identify the two (just two) key components that are must-haves right now. For clinic software: client management and appointments.
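The exception-reporting step above can be sketched in a few lines of Ruby. This is a minimal illustration, not ClinicWise's actual code; `deliver_alert` is a hypothetical stand-in for whatever mailer you use:

```ruby
# Minimal exception-reporting sketch: run the risky work, email the full
# stack trace on failure, then re-raise so the error is still visible.
def with_exception_report
  yield
rescue StandardError => e
  body = "#{e.class}: #{e.message}\n" + Array(e.backtrace).join("\n")
  deliver_alert(subject: "[ClinicWise] #{e.class}", body: body)
  raise
end

# Hypothetical mailer stub; in a real app, replace with ActionMailer,
# the mail gem, or a gem such as exception_notification.
def deliver_alert(subject:, body:)
  $last_alert = { subject: subject, body: body }
end
```

In a Rails application you would normally hook this in at the middleware level or via a gem; the point is that every production error arrives in your inbox with the full backtrace.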
After 40 hours (I counted) of spare time, the first production release was out. The client could put it to use: registering new clients and booking appointments.
Consult with the client to understand the business request. Using the demo server for communication proved time-saving.
Implement the feature or enhancement, or fix the bug.
Check in and kick off a build on the CI server (in our case, BuildWise).
If all tests pass, release to the demo site for the client to check.
Release to production at a time convenient to the client (such as midnight).
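The release gate in the cycle above (ship only when the build is green) can be sketched like this. It is a toy illustration of the decision logic only; `deploy` and the lambda-based test suite are hypothetical stand-ins for your real build and deployment commands:

```ruby
# Release gate: deploy only when every test in the suite passes.
def release_if_green(tests, target)
  failures = tests.reject { |_name, test| test.call }.map(&:first)
  if failures.empty?
    deploy(target)
    :released
  else
    puts "Build red (failed: #{failures.join(', ')}), not releasing"
    :blocked
  end
end

# Stand-in for the real deployment step (Capistrano, rsync, etc.).
def deploy(target)
  puts "Deploying to #{target}"
end
```

In practice a CI server such as BuildWise enforces this gate for you; the value is that an unattended rule, not a tired human at midnight, decides whether the release goes out.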
The automated tests and CI prevent numerous errors from reaching production, and they also enable upgrades to the latest versions of frameworks and libraries (there were some scary stories there; thank goodness for automated tests!). A software release is NOT a stressful activity. The client has got used to frequent releases: "it is getting better and better". On his recommendation, several dental and physio clinics signed up with ClinicWise.
Without test automation and CI, ClinicWise would not be possible. The process prevented many bugs (most from ourselves, some from changes in dependent libraries and infrastructure), so customers only see stable, 'keeps getting better' software. More importantly, frequent releases gave us quick feedback, keeping development on track and our customers happy: the software is what they really want!
You might have noticed that, like 97% of blogs, this blog has not been updated for a while. That does not mean I stopped writing; instead, I consolidated my ideas and experiences into books, which I think will be more helpful to readers:
Being an ISV, resources are limited; most likely only you (and your loyal partner, if you are lucky) do everything, in your spare time (don't ruin your family life). Working harder is often not enough; working smarter is the key.
Months ago, my brother, a dentist, and his partners opened a big dental practice. They needed a medical practice system; the best quote they got was $25,000. My brother told me casually that the software was not stable during the trial, but there were no other choices. I offered: maybe I can write one for you (I had never built this kind of system before). I got my first release out within a week (in spare time, about 20 working hours) for feedback, and into production within one month. They have now been using it with satisfaction for 6 months. Just last week, a fellow dentist visited my brother's practice, saw the system, showed great interest and wanted to adopt it.
How? (My brother and I live in different countries.) The secret: test automation. To get their feedback during the peak time, I released a new version pretty much every night. The automated tests prevented me from making mistakes. Seeing is believing.
TestWise test case stats
BuildWise CI build report (28 minutes including a database reset; that's quite a lot of tests)
StoryWise requirement coverage
I have two friends who built a very nice web application. During one discussion, they shared their slight concern about growing competition, which is inevitable (web applications mean global competition). I said: "Keep improving your app. Don't worry until one of your competitors discovers TestWise."
It is not hard to imagine how much taxpayers' money was wasted. Remember, a payroll system is only one kind of system. While Queensland is still in the middle of a mining boom, the government is in huge debt. The largest sacking of public servants is about to happen next month: around 20,000.
Also as a result, the Queensland government is not willing to support the National Disability Fund.
As an IT professional, it is heart-breaking to see IT disasters contribute (again) to misery in people's lives, particularly for the disadvantaged.
We have all heard of 'Test-Driven X', though many haven't seen it working. Speaking from my experience, software test automation is the simplest, quickest and most accurate way to measure whether vendor software is up to the job. Don't fall for fancy talk and slides; just ask: "Show me how you automate testing of your application."
Some might argue that this is just my opinion. Check out the Auditor-General's report on the QLD Health payroll: the pay system was not properly tested. QLD Health payroll is an IBM/SAP project. Now a simple question: "How much IBM Rational test software was sold to the QLD government after this report?" An even simpler and more logical question: "Is it being used?"
"Can TestWise test Windows native apps?" I receive this question now and then, even when the person asking knows TestWise is for testing web applications.
My answer is: YES and NO. (I know that sounds like a politician.)
First of all, testing Windows native apps is a lot harder, as there are no standard controls. Identifying some controls is particularly challenging, or sometimes impossible. Using screen coordinates is not a good idea, as the resulting test scripts are too fragile. To make matters worse, assertions are very limited. For that reason, I prefer the term "automation" rather than "test automation" for native apps.
But there is still value in automating native apps. The technology I use is AutoIt3, a free and widely used Windows scripting engine. AutoIt3 comes with a Window Info tool, which can be used to identify control IDs or control coordinates.
It is important to use the window-based coordinate mode, so that if you have to control the mouse in your test script, the script still works when the window moves.
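To illustrate why window-based coordinate mode is more robust: the click point is derived at run time from the window's current origin plus a window-relative offset, so moving the window changes nothing in the script. A toy sketch of that arithmetic (plain Ruby, not AutoIt3 itself):

```ruby
# Window-based coord mode: coordinates are stored relative to the window;
# the absolute screen point is computed at click time from the window origin.
def absolute_point(window_origin, relative)
  [window_origin[0] + relative[0], window_origin[1] + relative[1]]
end

ok_button = [120, 45]  # relative to the window's top-left corner

p absolute_point([0, 0], ok_button)      # window at the screen origin
p absolute_point([300, 200], ok_button)  # same script after the window moved
```

With screen-based coordinates, the second click would land 300 by 200 pixels off target; with window-based coordinates, only the derived absolute point changes, not the script.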
I like Ruby, so I created a wrapper for AutoIt3 called RFormSpec; it is free and open source. TestWise comes with RFormSpec, and in many ways it works the same way: page objects, auto-completion and refactoring. By using page objects, the automation scripts stay quite maintainable.
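The page-object idea can be sketched as a "window object": control identifiers live in one class, and scripts call intent-level methods, so when developers rename a control only one place changes. This is an illustration of the pattern only; the `RecordingDriver` stub below is hypothetical and is not the real RFormSpec API:

```ruby
# Window object: control IDs live in one place; scripts express intent.
class LoginWindow
  USERNAME_ID = "Edit1"   # if developers rename a control,
  PASSWORD_ID = "Edit2"   # only these constants change
  OK_BUTTON   = "Button1"

  def initialize(driver)
    @driver = driver
  end

  def sign_in(user, password)
    @driver.set_text(USERNAME_ID, user)
    @driver.set_text(PASSWORD_ID, password)
    @driver.click(OK_BUTTON)
  end
end

# Hypothetical stub standing in for the real automation engine;
# it records actions so the example is runnable without Windows.
class RecordingDriver
  attr_reader :actions

  def initialize
    @actions = []
  end

  def set_text(id, text)
    @actions << [:set_text, id, text]
  end

  def click(id)
    @actions << [:click, id]
  end
end
```

A script then reads as `LoginWindow.new(driver).sign_in("james", "secret")`; no control ID appears outside the window class.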
At AgileWay, we use the term 'wiser tester' to describe our customers.
Wise: "having the power of discerning and judging properly as to what is true or right" - dictionary.com
Just now, I watched an introduction video for a testing product, with the same old tricks: 'scriptless automated testing' and 'record and playback'. I don't understand why on earth some testers, testing managers and CIOs still buy into it. If it is that simple, why isn't your organization using it now? A more insightful thought: if it works as advertised, list the reasons why your company could not replace you (as an automated tester) with a new graduate.
I have seen this too many times:
* A new test manager/CIO is ambitious to introduce test automation (there is nothing to brag about in manual testing)
* Purchased a very expensive testing tool package
* Testers (assigned to do test automation) became slaves to recorded test scripts, which didn't work
* Report: not suitable for the organisation
* Keep paying the remaining yearly product support (usually 3 years)
You can see the obvious flaws there. If the testing product were as good as the salesmen said, why not ask them to cut the sales talk in half and write real tests for typical web applications in your organisation? (Testing, compared to programming, is practical, and the knowledge is quite transferable. I usually tell my potential customers: after this brief talk, if permitted, I will spend 10 minutes with your tester writing some real tests for your app.) Now I am starting to understand that the same stories, repeated again and again in organisations, are just games, especially common in government. This might help you understand why so many governments are in debt crisis :-). It may take effort for a project to decide on a suitable programming architecture: Java, .NET or Ruby. But for automated functional testing, in my opinion, it is black and white.
Wise people take control, and wise testers will ask logical questions such as:
* If programmers make a change to web pages, how can I maintain the test scripts effectively?
* How to handle popups such as user login?
* When I have a lot of test scripts and execution takes too long, how can I manage them better?
* How can I include automated test execution in a Continuous Integration process? (Show me a working example.)
* How easy are your test scripts to read?
In developed countries such as Australia, more and more IT jobs are outsourced to India and other countries; that's a fact. Just like manufacturing, to compete you need to improve productivity and innovate (like Facebook, where 100+ people use Watir, a free open-source test framework, for testing). When I see unwise testers and managers jumping up and down setting up manual testing processes and fighting against agile development, I don't know what to say (maybe 'sad little creature', from one of the Pixar movies); it just seems to me that someone is digging his own grave. The quick feedback loop in agile teams is what keeps your jobs here. Let me tell these unwise testers: the testers in outsourcing countries follow the boring, fully specified manual process better than you do.