I am amazed at how much emphasis people put on technical and domain skills when advertising for a tester. Back in the days when I was looking for Test Analyst or Test Lead jobs, I would often be discouraged from applying because the listing would state something like “Must be proficient in COBOL programming” (yes, I’m that old!) or “Extensive knowledge of the mortgage industry essential”. Now, two decades later, just the briefest glance at job listings throws up “Previous experience working with CRM systems” and “Experience of working in the telecommunications sector” as essential requirements for Test Analyst roles, and similar statements can be seen in almost every listing. Even then, I used to think to myself, “Well, I’m a good tester. I can read requirements and derive tests from them. Why do I need an in-depth knowledge of the system or business area?”.
More recently, as somebody who hired test analysts, I saw three distinct types of applicant: solid testers with little or no experience in the system or domain; poor testers with a lot of experience in the system or domain; and (rarely) solid testers with a lot of experience in the system or domain. Of these three, I won’t deny that the last often worked out best when I could find them, but solid testers without the system experience often proved almost as effective and always out-performed poor testers with system knowledge. Today, I would always recruit the career tester who demonstrates an interest in and commitment to understanding the processes of testing but lacks the specific system knowledge over an applicant with an in-depth understanding of the system but little perceivable interest in or commitment to testing as a career. Why? Because, simply put, it’s almost always easier to teach a good tester enough about a system to test it effectively (which isn’t necessarily to expert level) than it is to teach a system expert to be a good tester.
I have seen organisations spend enormous amounts of time and money training their testers to be experts in their systems without realising any appreciable benefit from that investment. Especially given the increasing use of collaborative ways of working, testers today need a much more rounded set of skills: test process understanding; communication; stakeholder management; risk mitigation and so on. Assuming that strong system knowledge is all that is required to ensure effective testing is a dangerously naïve view.
So, what should we be looking for in our testers? I would look for the following attributes, in descending order of importance:
- Commitment to a career as a tester
- Understanding of how to apply effective test processes
- Appreciation of risk mitigation strategies in testing
- Communication skills: the ability to cogently explain the plans and outcomes of testing to stakeholders
- Relevant system and domain knowledge
- Relevant technical knowledge
How does this relate to what I do now: assessing quality processes and advising organisations on how they can optimise and improve the way that they work? As a TMMi Lead Assessor, I speak to people at all levels of an organisation to confirm that appropriate, effective and consistently applied processes are in place to ensure that the risks associated with software development are identified and mitigated through testing activities. The TMMi framework talks a lot about the skills I have discussed above: test process, risk mitigation and communication with stakeholders at all stages of the development lifecycle. It makes little specific mention of technical or domain skills. I know that organisations assessed at a TMMi maturity level are carrying out effective testing and, of course, that many of them will have testers with domain and system knowledge. However, I would contend that this knowledge is secondary to the implementation of the processes that TMMi describes. We need to take this on board and ensure that we recruit testers for the most relevant skills that they have.
Dave Allan says
Hi Simon
Long time no see!
I couldn’t agree more about the value of solid experience in the testing specialism being a potentially greater asset in test resources than system-specific expertise or industry experience, whether that be retail, telecoms or the investment banking “closed shop”. There are exceptions, of course, particularly the growing requirement for resources capable of engaging in infrastructure test activities, due to companies having to upgrade their environments and services as a generation of legacy systems becomes reliant on hardware, operating systems and services which are no longer supported by the vendors.
That said, I sometimes wonder what role the recruitment industry has played in pigeon-holing candidates into specific categories, which can lead to weaker candidates being presented to clients simply because they have some experience in the client’s industry sector.
Back in the mists of time, recruitment consultants were often ex-contractors, or at least had some IT experience, and seemed to have a greater understanding of the roles for which they were trying to recruit candidates. These days it sometimes feels as if the recruitment process is driven solely by buzzword searches of CV databases and automated candidate-selection software, rather than any real understanding of the specific roles they are trying to fill.
The initial shortlisting is often left to twenty-something graduate trainees (with grandiose titles such as Talent Acquisition Consultant!), whose candidate list is then vetted by an Account Manager who often has limited IT experience themselves. Given the number of candidates chasing each role, I can understand why they want to streamline the process, and the modern-day approach is clearly easier for them if the candidate-selection filters are very specific, but clients are likely to be presented with a somewhat skewed subset of the available testers as a result, and I am sure this contributes to the phenomenon you describe in your article.
Steve Wilson says
Regardless of how solid the testing framework is, testers with little or no domain knowledge are not a recipe for testing success where the Client sees value for money. Testing resources need knowledge of the industry, an appreciation of what the organisation does and some experience of the functional business area that they are to test.
Clients don’t want an army of testers who are unable to converse with business users, do not understand the contents of the existing test pack and are unable to amend existing test cases or create new ones for new requirements. Having to run knowledge transfer for your resident testing team is an unwanted overhead which Clients quickly identify, and learning on the job should not be an invisible part of a Statement of Work.
I have seen the above, and it carried so much testing, operational and reputational risk that it instantly became a “Testing War Story”. Needless to say, it imploded…
divya says
Wonderful article, very useful and well explained. Your post is incredible. I will refer this to my candidates.
Roman Nits says
Totally agree with the article.
I can add that when interviewing QA candidates I found there was no sense in filtering them out by domain knowledge at all (banking, financial markets). The minimum I need from the technical side for functional QAs is a basic knowledge of SQL. I spend the other 80% of my interview trying to understand the candidate’s process knowledge, communication skills, approach to risk and so on. I had been working this way for a year before I understood that the approach works.
Since that time I have had various success stories, but maybe the best of them are two fellows who came to my interview as interns (no experience at all). After three years, one of them has grown into a TM and the other is a candidate for Snr. Automation. Three years, without any experience at the start. It is still a little painful to realise that if those guys had come to me a bit earlier, my mistaken view of the skills and values needed in candidates could have meant that such specialists ended up outside my project (and I am sure they would have).