Friday, 11 August 2017

On Why the "Testers Should Know How to Code" Mantra Could Hurt Us All

On 31st July 2017, the tester and blogger Trish Khoo ran a Twitter poll asking "Should all testers be learning to code?". The poll produced a near 50/50 split in opinion and numerous comments on either side.

As a tester with a coding background who is also studying for a computing degree, I took a great interest in the debate and read up on other perspectives on the subject. There are various competing arguments - a selection of which I have outlined below.

Perspectives on Testers Knowing Coding

Trish Khoo, in "Yes all testers should learn how to code", her follow-up to the poll above, argues using Australian, US and UK sources that a basic level of programming is now routinely taught to schoolchildren and will come to be seen as much of a fundamental skill as Maths and Science. She concludes that all testers working in software development should know programming to future-proof their careers.

Joel Montvelisky's very detailed 2017 article for PractiTest "Stop being a NON-Technical Tester!" advocates that a tester should have sufficient coding skill to do the following -

  • Understand the Architecture of the Product under Test
  • Review the Code under Test (e.g. SQL queries, scripts and configuration files)
  • Automate repetitive sanity and smoke or setup tasks
  • Use free or paid automation tools such as Selenium, QTP etc.
  • Troubleshoot from Logs and other System Feeds
  • Run bespoke SQL queries
  • Talk the language of their technical peers
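
Several of the skills above come down to light scripting rather than full development. As an illustration of "automate repetitive sanity or setup tasks", here is a minimal sketch in Python; the file name and required keys are hypothetical, stand-ins for whatever a real project's configuration needs:

```python
import json
from pathlib import Path

def smoke_check_config(path):
    """Basic sanity check before a test run: the config file exists,
    parses as JSON, and contains the keys the application needs.
    Returns a list of problems (empty means the check passed)."""
    required_keys = {"base_url", "db_connection", "timeout_seconds"}
    config_file = Path(path)
    if not config_file.exists():
        return ["config file missing: " + str(config_file)]
    try:
        config = json.loads(config_file.read_text())
    except json.JSONDecodeError as err:
        return ["config is not valid JSON: " + str(err)]
    # Report each required key that is absent from the parsed config.
    missing = required_keys - config.keys()
    return ["missing key: " + key for key in sorted(missing)]
```

A dozen lines like this, run before every test cycle, can save hours of chasing "bugs" that turn out to be a broken environment.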

Elisabeth Hendrickson's seminal 2010 blog article "Do Testers Have to Write Code?" argues that only testers doing scripted automation require programming skills. However, drawing on her own survey of US tester job-ad data, she extends her argument to advocate that anyone serious about being a professional tester should know at least one language (she recommends SQL) as a minimum. In that sense, her argument falls into the same category as Trish Khoo's.

Rob Lambert's 2014 Social Tester blog article "Why Testers Really Should Learn to Code" puts Elisabeth Hendrickson's position far more bluntly: the job market demands it, and testers who cannot code are committing career suicide and will be pushed out of the market by those who can.

Michael Bolton's article "At Least Three Good Reasons for Testers to Learn to Program" does not oblige testers to know programming (in the comments he in fact advocates diversity of skills at both the individual and team level). He does, however, advocate learning to program for the tooling opportunities it opens up, for the insight it gives into how computers and programs work (and may fail to work), and for the humility and empathy it builds with programmers, since coding can be very difficult.

Alessandra Moreira's 2013 article "Should Testers Learn to Code" takes a balanced approach, referencing other articles (including those above) without taking a firm stance. She makes the point that many good testers cannot code yet are still effective, and that not all testers actually enjoy coding.

I think the debate actually breaks down into two different questions:
  • Are there benefits to testers to learn some coding?
  • Should all testers be expected or obliged to know how to code?

My Perspective

Based on my own experience, I think the answer to Question 1 is undoubtedly yes. There are many benefits in testing generally (although not in all testing jobs in equal measure) to knowing some programming. My knowledge of SQL (from my pre-testing days as a DBA) has been a great boon to my usefulness and success on various web and data-centric projects, and thanks to a knowledge of C# and Java I have been able to bring test automation and test data creation into environments that previously had neither. Coding knowledge opens the door not just to test automation but also to interesting and important areas such as penetration testing, unit and integration testing (usually done by developers, but with increasing involvement from test analysts) and performance testing.
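To show how little SQL is needed before it starts paying off on a data-centric project, here is a small sketch using Python's built-in sqlite3 module. The tables and rows are entirely hypothetical test data; the bespoke query is a classic referential-integrity check - finding orders that point at a customer who does not exist:

```python
import sqlite3

# In-memory database standing in for the system under test's data store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (10, 1, 9.99), (11, 2, 24.50), (12, 3, 5.00);
""")

# Bespoke check: orders whose customer_id matches no customer row.
orphans = conn.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

print(orphans)  # → [(12,)] - order 12 references missing customer 3
```

Queries like this let a tester verify what actually landed in the database, rather than trusting what the UI claims.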

Question 2 is far more problematic and a firm answer of "yes" - taken as standard in the industry - I believe would be wrong and dangerous for the testing community. Some reasons are stated below.

1) Exaggeration of the Importance of Programming Skill

Programming is a very useful skill, but it is not critical for all roles. There are still many functional testing projects that carry on just fine with purely manual testing (my last role was one of these), or where automation is impractical or handled by existing development resource. A keen tester looking to code heavily might well feel like a fish out of water in such roles, becoming frustrated and letting their skills atrophy. Common sense and judgement need to be applied.

Elisabeth Hendrickson notes -

"Testers who specialize in exploratory testing bring a different and extremely valuable set of skills to the party. Good testers have critical thinking, analytical, and investigative skills. They understand risk and have a deep understanding where bugs tend to hide. They have excellent communication skills. Most good testers have some measure of technical skill such as system administration, databases, networks, etc. that lends itself to gray box testing. But some of the very best testers I’ve worked with could not have coded their way out of a For Loop."

There are other skills required for almost every testing job - planning, critical thinking, tenacity, conscientiousness, teamwork, written and verbal communication, soft skills, management and leadership, bug reporting and advocacy. Testers without any programming skill who are looking to enter and advance in the profession would be better served improving in these areas first instead of learning coding from scratch. Hiring managers for projects where test automation is not required, or can be allocated to others, would be best served hiring for the generic core skills emphasized above, rather than insisting on a tester who can program "just in case".

2) The Plurality of Tester Backgrounds and Perspectives will be Damaged

The testing field is uniquely accommodating to those from a wide range of backgrounds and disciplines - very few of which would have required programming - who can provide a plurality of perspectives and immediate utility.

  • Testers from business and industry fields, business analysis and the service desk can bring domain knowledge, a commercial, user-oriented focus, and experience of the kinds of failures that are most critical and should be looked out for.
  • Those from science and engineering backgrounds can bring great analytical, mathematical, experimental and system modelling skill.
  • Those from arts and humanities backgrounds are good at analysing data from diverse sources and documents and can provide great verbal and written communication and reporting skill. Musicians and foreign language grads already deal with complex systems riddled with rules and exceptions. They will find their niche in testing.
  • Some of our greatest testers have had no formal education but great practicality, a hard work ethic, passion for technology and no shortage of analytical or soft skills.

As a community, and following on from 1), we need to protect this diversity of background and perspective if we want to achieve great things for our clients and employers. Requiring by default that testers be coders erects a needless barrier, preventing those from outside who could provide much to software development teams from getting a foothold in the industry.

3) How Much Programming Skill is Enough? In what Areas? How would this be demonstrated?

If we impose a requirement that all testers, upon entering the profession, require programming skill to be hireable or useful, the testing community would probably have to define a basic curriculum at least as a guide. What is the base minimum? Even in roles that require some programming skill, the "amount" required is highly contextual.

Montvelisky states in his article that the base minimum would be the ability to read and edit configuration files, execute SQL queries, automate test setup tasks and use frameworks such as Selenium and QTP / UFT. As a minimum this sounds reasonable; however, considering the sheer number and flux of operating systems, setup tools, scripting languages and test frameworks out there, even this is an enormous amount of learning.

  • As an example, we may ask for simple scripting skills. Do we expect knowledge of Windows batch scripting, PowerShell, Linux Bash, Perl, Python, or even JavaScript for Node.js? All of these are extremely useful to know, and I have already used several of them at work and in my studies.
  • For data retrieval, we would probably mandate SQL as a minimum; however, NoSQL databases such as MongoDB are increasingly popular these days. Can we afford to leave them out? Why not also REST APIs and web service tools? JSON and XML?
  • Regarding test automation frameworks, we have Selenium, Postman, SoapUI, Cucumber etc., but are these enough? Many companies in the corporate world still use QTP, SilkTest and TestComplete - each with its own scripting languages and tooling. We would struggle not to include them - and since they are high-cost and proprietary, it is difficult for students to get their hands on them outside of teams that already license them. Even Selenium has bindings in several languages, including Java, C#, Python and JavaScript.
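Even the "simple" end of that spectrum is not trivial: checking a single REST response body already assumes familiarity with HTTP, JSON and a scripting language. A minimal sketch in Python, using a hard-coded response body in place of a live API call (the field names and checks are hypothetical):

```python
import json

# A captured response body, standing in for output from a live REST API.
response_body = (
    '{"status": "ok", '
    '"items": [{"id": 1, "price": 10.0}, {"id": 2, "price": -3.0}]}'
)

payload = json.loads(response_body)

# Checks a tester might script: the status flag, and no negatively priced items.
problems = []
if payload.get("status") != "ok":
    problems.append("unexpected status: %r" % payload.get("status"))
problems += ["item %d has negative price" % item["id"]
             for item in payload.get("items", [])
             if item["price"] < 0]

print(problems)  # → ['item 2 has negative price']
```

Nothing here requires a CS degree, yet it already touches data formats, a language runtime and defensive coding - which is exactly why "some scripting" is harder to pin down as a universal minimum than it sounds.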

The above would be difficult to achieve even for recent CS and software engineering grads. The testing community cannot even reach a consensus on what is testing and what is "checking", so could it agree on a basic programming requirement? I doubt it could, and this would cause uncertainty and confusion for all of us. Groups like the ISTQB and IEEE would be tempted to use the vacuum to impose a minimum test programming standard (and even provide certification in it), which others such as the Context-Driven and Rapid Software Testing communities would fight hard to resist - creating another great schism in the testing community.

We could let individual recruiters and teams decide (as is done in development); however, since it is impossible to reach a professional level in all of the above, new testers would have to specialise before even getting their first jobs, with that decision closing off large areas of the job market. Established testers who may have no interest in programming whatsoever would have to spend enormous time and resources to upskill, specialise and learn tools they do not care for just to be considered "adequate", despite their skills in other areas and otherwise stellar achievements to date. Is that fair to them?

Mandating that testers must have some minimum programming knowledge opens up a minefield of questions and concerns that the testing community will struggle to agree on. This could well lead on to point 4 below.

4) We Encourage Unhelpful and even Lazy Recruiting Practices

As someone who has done commercial development in the past, is studying computing at postgraduate level and takes a great interest in programming now (although happily committed to being a tester), I spend some time looking at the various IT and dev industry forums. Regarding recruitment, there are various complaints I have come across from others -

  • Employers and recruitment consultants who "demand the world" - requiring an unreasonably long list of programming languages and frameworks, and rejecting applicants whose skills are slightly different but quickly transferable given shared frameworks and underlying concepts.
  • Job ads for "entry-level" developer roles requiring CS degrees (even for relatively simple programming tasks that non-graduates could do) and years of commercial experience - requirements suspected of existing simply to cut down the number of applications.
  • Job ads requiring years of commercial experience in the latest and trendiest tools of the day, which hurts the chances of those outside new startups and innovative projects: developers in BAU and corporate environments using long-established tools, and older programmers.

Without a consensus on Point 3, I suspect that some of the lazier and more unhelpful recruitment practices mentioned above will flood into testing recruitment. I have already seen ads for experienced testers requiring a CS degree as a minimum, which disregards the skills, achievements and experience of those with backgrounds in other domains who could still do the job.

Final Points

Dorothy Graham, in her 2014 blog article "Testers Should Learn to Code?", is strident that a mandatory coding requirement is a dangerous attitude and lists various thought-provoking reasons, some of which overlap with the above. A selection -

  • Test Managers can use this as a justification to get rid of good and productive testers.
  • Not all testers will ever be good at or interested in programming.
  • Testing skills will be devalued relative to coding skills.
  • Tester-Developers will either choose to or be forced into being developers, thus we lose people from the testing profession.

I agree with her opinions, and hope that this blog article and those linked from it continue to be a useful part of the testing-coding debate. An expectation that testers must be able to code may not help, and may in fact cause needless chagrin in our profession.


  1. Why are we still "debating" this?

    Standing by the above. Paul.

    1. Thanks Paul, your article raises some very interesting points for us to ponder.

  2. For me this is a question of balance of skills and work focus. I've always been a "Technical Tester" type, and that is because of my educational background in the Natural Sciences (Zoology). As part of my college studies I took CS classes in programming, because I wanted to combine the two things. I was also lucky to have programming classes in high school, so that had some influence on me as well. I started off as a programmer, but got into testing because of circumstances at that time. It wasn't because I was a bad programmer - I was decent and wrote solid code. But I had a knack for testing and saw it as a niche I could excel in at that time (1988), and a few years later (1992) automation was added to my skill set. Again, another niche area that I could capitalize on. But I've always tried to balance the two, testing and programming, via my work with automation. I've always seen automation as a tool to aid in testing and not as a replacement for human testers.

    What I've seen over the last 5-10 years is a desire to swing the pendulum fully to one side - automation and programming - and force the testing work/world out of balance. The desire to "automate" the human tester's work with a "robot" (computer), much like automation in manufacturing, has caused a lot of headaches and confusion. This is in large part an effort to reduce or eliminate the cost of human resources for the testing work. It's starting to look like the Star Trek (TOS) episode "The Ultimate Computer", with testers becoming "Captain Dunsel" (dunsel - a part which serves no useful purpose).

    This is partly due to the heavy push to implement the Test Automation Pyramid philosophy on projects, and thus a shift in focus and skills has occurred. But it has been such a hurried and frantic push that it has created its own problems - hence the discussion we are now in. My point is this: we need to find the balance again, and get things back to a state of equilibrium. How we do that is what is key.

    Jim Hazen

    1. Thanks for your comment. I see programming skill as a specialism that testers can learn if it interests them - I am certainly interested in learning more programming myself. However, one of the painful lessons we have all learned in recent years is that ill-considered automation is anything but a saving in human resources: it is an expensive waste that serves nothing. Automation is not a panacea for testing and, in my opinion, still plays second fiddle to basic test planning, execution and reporting, as well as the analytical and soft skills all testing jobs require. I hope a balance can be achieved.

  3. Excellently balanced piece which elegantly describes the objections I have to being forced into some area I have no desire to visit. Not coding does not mean non-technical. Well done

  4. Hello, Paul!
    Great article, first of all, and thanks for bringing all these sources and opinions together.
    Here's what I think.
    If testers are afraid of the steep learning curve that programming - in particular new frameworks, technologies, tools, patterns etc. - brings, they are on the wrong track. Testing is constant learning; anyone afraid of learning, or afraid of complexity, has picked the wrong profession.
    As for programming skills, I think they are useful.
    As for the current state of testing - I think it's useful to know programming not to the extent of creating programs, but at least to understand how programs work, what they are composed of, what the process of executing them is, and what all these different formats are. It sounds almost ridiculous to have a web tester who has no knowledge of what the web is and how client-server communication works.
    As for the future state of testing - like it or not, the testing craft is shifting from "just testing" to "coded testing". Whether this is right, or whether we like it, is a matter for another discussion; the fact is, it's happening. So, if we want even a seat at the discussion table, to share our opinion on that question and be credible, we have to be able to create testing programs and to advocate why they can or cannot perform testing that is satisfactory for our clients' needs.

    Best regards,
    Mr. Slavchev

    1. Thanks. I mention various articles that advocate all testers knowing programming from a job-requirement/utility point of view. While I do state that testers benefit from knowing some programming and even computer science, and agree that there is a shift towards coded testing, I still come across many manual and exploratory testing roles where knowledge of automation plays second fiddle to basic manual testing and analytical skills. Also, even if all the manual testing jobs were being turned into automation jobs, I still feel it is up to the testing community to challenge the view that coding and automation are a substitute for the full range of non-coding skills and perspectives provided by the skilled tester - all of which are important to the success of IT projects.

    2. Also, as you stated -

      "I think it's useful to know programming not to the extent of creating programs, but at least to understand how programs work, what they are composed of, what the process of executing them is, and what all these different formats are. It sounds almost ridiculous to have a web tester who has no knowledge of what the web is and how client-server communication works."

      I agree with all of this; however, it is possible to know these things at a certain level without necessarily being able to, say, write JavaScript web applications, write server-side apps in Python or configure an Apache server. The question is: if we decide that all testers should have some sort of technical knowledge, what level is suitable for someone entering the testing profession?

  5. BTW, it's "Elisabeth" Hendrickson not "Elizabeth". Disclosure: I'm a former co-worker of hers.

    1. Thanks and apologies GLMeece, I've now corrected the spelling.

  6. Hi Paul,

    Thanks - you make a lot of very interesting points and nicely summarise a lot of discussion (as well as stimulating some more in the comments).

    We seem to have a tendency (as humans) to try to make everyone the same - "what's good for the goose is good for the gander". But people working in testing have such a great variety of skills and backgrounds - our diversity is our strength. We should celebrate different views and skills, not try to get everyone to have the same skills (however useful to many).

    Thanks again.

    Regards, Dot
