Friday 11 August 2017

On Why the "Testers Should Know How to Code" Mantra Could Hurt Us All

On 31st July 2017, the tester and blogger Trish Khoo ran a Twitter poll asking "Should all testers be learning to code?". The result was a near 50/50 split in opinion, with numerous Twitter comments on either side.



As a tester with a coding background who is also studying for a computing degree, I took a great interest in the debate and read up on other perspectives on the subject. There are various competing arguments, a selection of which I have outlined below.

Perspectives on Testers Knowing Coding


Trish Khoo, in her follow-up blog to the poll above, "Yes all testers should learn how to code", argues from Australian, US and UK sources that a basic level of programming is now routinely taught to schoolchildren and will come to be seen as much a fundamental skill as Maths and Science. She concludes that all testers working in software development should learn programming to future-proof their careers.

Joel Montvelisky's detailed 2017 article for PractiTest, "Stop being a NON-Technical Tester!", advocates that a tester should have sufficient coding skill to do the following -

  • Understand the Architecture of the Product under Test
  • Review the Code under Test (e.g. SQL queries, scripts and configuration files)
  • Automate repetitive sanity and smoke or setup tasks
  • Use free or paid automation tools such as Selenium, QTP etc.
  • Troubleshoot from Logs and other System Feeds
  • Run bespoke SQL queries
  • Talk the language of their technical peers
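
To make the "automate repetitive sanity and smoke or setup tasks" point concrete, here is a minimal sketch in Python of the kind of smoke check a tester might script. It is illustrative only: the health endpoint, handler and URLs are invented, and a throwaway local server stands in for the real product under test.

```python
import http.server
import threading
import urllib.request

# Hypothetical stand-in for the product under test: a tiny server
# exposing a health endpoint that returns 200 OK when all is well.
class HealthHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet

server = http.server.HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The repetitive smoke check itself: hit the endpoint and verify
# the response, rather than clicking through the app by hand.
with urllib.request.urlopen(
        f"http://127.0.0.1:{server.server_port}/health") as resp:
    status = resp.status
    body = resp.read()

server.shutdown()
print("smoke check:", "PASS" if status == 200 and body == b"OK" else "FAIL")
```

A script like this, run before a test session, answers "is the environment even up?" in seconds, which is exactly the class of task Montvelisky suggests automating first.
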

Elisabeth Hendrickson's seminal 2010 blog article "Do Testers Have to Write Code?" argues that only testers doing scripted automation require programming skills. However, drawing on her own survey of US tester job-ad data, she extends her argument to advocate that anyone serious about being a professional tester should know at least one language (she recommends SQL) as a minimum. In that respect her argument falls into the same category as Trish Khoo's.

Rob Lambert's 2014 Social Tester blog article "Why Testers Really Should Learn to Code" puts Elisabeth Hendrickson's position far more bluntly: the job market demands it, and testers who cannot code are committing career suicide and will be pushed out by those who can.

Michael Bolton's article "At Least Three Good Reasons for Testers to Learn to Program" does not oblige testers to learn programming (indeed, in the comments he advocates diversity of skills at both the individual and team level). He does, however, recommend learning for three reasons: more opportunities for tooling; insight into how computers and programs work (and may fail); and humility and empathy towards programmers, born of appreciating that coding can be very difficult.

Alessandra Moreira's 2013 article "Should Testers Learn to Code?" takes a balanced approach, referencing other articles (including those above) without taking a firm stance. She makes the point that many good testers cannot code yet are still effective, and that not all testers actually enjoy coding.

I think the debate actually breaks down into two different questions.
  • Are there benefits for testers in learning some coding?
  • Should all testers be expected or obliged to know how to code?

My Perspective

Based on my own experience, I think the answer to Question 1 is undoubtedly yes. There are many benefits across testing generally (although not to all testing jobs in equal measure) to knowing some programming. My knowledge of SQL (from my pre-testing days as a DBA) has been a great boon to my usefulness and success on various web and data-centric projects, and my knowledge of C# and Java has let me bring test automation and automated data creation into environments that previously had neither. Coding knowledge opens the door not just to test automation but also to interesting and important areas such as penetration testing, unit and integration testing (usually done by developers, but with increasing involvement from test analysts) and performance testing.
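
As a sketch of the kind of bespoke SQL check I mean, the snippet below uses Python's built-in sqlite3 module to hunt for orphaned records, the sort of data-integrity bug a tester with database knowledge can catch quickly. The schema and data are invented for illustration.

```python
import sqlite3

# Hypothetical schema: every order must reference an existing customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         total REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 25.00), (11, 2, 40.00), (12, 99, 5.00);
""")

# A bespoke tester query: find orders whose customer does not exist.
orphans = conn.execute("""
    SELECT o.id
    FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.id
    WHERE c.id IS NULL
""").fetchall()

print(orphans)  # order 12 points at the non-existent customer 99
```

A query like this takes minutes to write but can expose defects that no amount of clicking through the UI would reveal.
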

Question 2 is far more problematic. A firm answer of "yes", taken as standard across the industry, would I believe be wrong and dangerous for the testing community, for the reasons stated below.

1) Exaggeration of the Importance of Programming Skill


Programming is a very useful skill, but it is not critical for all roles. Many functional testing projects still carry on just fine with purely manual testing (my last role was one of these), or in situations where automation is impractical or handled by existing developers. A keen tester looking to code heavily would be a fish out of water in such roles and might well grow frustrated and see their skills atrophy. Common sense and judgement need to be applied.

Elisabeth Hendrickson notes -

"Testers who specialize in exploratory testing bring a different and extremely valuable set of skills to the party. Good testers have critical thinking, analytical, and investigative skills. They understand risk and have a deep understanding where bugs tend to hide. They have excellent communication skills. Most good testers have some measure of technical skill such as system administration, databases, networks, etc. that lends itself to gray box testing. But some of the very best testers I’ve worked with could not have coded their way out of a For Loop."

There are other skills required for almost every testing job: planning, critical thinking, tenacity, conscientiousness, teamwork, written and verbal communication, soft skills, management and leadership, and bug reporting and advocacy. Testers without any programming skill who are looking to enter and advance in their careers would be better served improving in these areas first rather than learning coding from scratch. Likewise, hiring managers on projects where test automation is either not required or can be allocated to others would do better to hire for these core skills than to insist on a tester who can program "just in case".

2) The Plurality of Tester Backgrounds and Perspectives will be Damaged


The testing field is uniquely accommodating to those from a wide range of backgrounds and disciplines - very few of which would have required programming - who can provide a plurality of perspectives and immediate utility.

  • Testers from business and industry fields, business analysis and the service desk bring domain knowledge, a commercial and user-oriented perspective, and experience of the kinds of failures that are most critical and should be looked out for.
  • Those from science and engineering backgrounds can bring great analytical, mathematical, experimental and system modelling skill.
  • Those from arts and humanities backgrounds are good at analysing data from diverse sources and documents and bring strong verbal and written communication and reporting skills. Musicians and foreign-language graduates already deal with complex systems riddled with rules and exceptions; they will find their niche in testing.
  • Some of our greatest testers have had no formal education but great practicality, a hard work ethic, passion for technology and no shortage of analytical or soft skills.

As a community, and following on from 1), we need to protect this diversity of background and perspective if we want to achieve great things for our clients and employers. Requiring that testers be coders by default erects a needless barrier, preventing people from outside the field who could offer much to software development teams from getting a foothold in the industry.

3) How Much Programming Skill is Enough? In what Areas? How would this be demonstrated?


If we impose a requirement that all testers, upon entering the profession, must have programming skill to be hireable or useful, the testing community would probably have to define at least a basic curriculum as a guide. What is the base minimum? Even in roles that require some programming skill, the "amount" required is highly contextual.

Montvelisky states in his article that the base minimum would be the ability to read and edit configuration files, execute SQL queries, automate test setup tasks and use frameworks such as Selenium and QTP/UFT. As a minimum this sounds reasonable; however, given the sheer number and flux of operating systems, setup tools, scripting languages and test frameworks out there, even this is an enormous amount to learn.

  • As an example, we may ask for simple scripting skills. Do we expect knowledge of Windows batch scripting, PowerShell, Linux Bash, Perl, Python, or even JavaScript for Node.js? All of these are extremely useful to know, and I have already used some of them at work and in study.
  • For data retrieval, we would probably mandate SQL as a minimum, but NoSQL databases such as MongoDB are increasingly popular these days. Can we afford to leave them out? And why not also REST APIs and web services tools? JSON and XML?
  • Regarding test automation frameworks, we have Selenium, Postman, SoapUI, Cucumber and so on, but are these enough? Many companies in the corporate world still use QTP, SilkTest and TestComplete, each with its own scripting language and tooling. We would struggle not to include them, yet since they are high-cost and proprietary, it is difficult for learners outside teams that already own them to get hands-on experience. Even Selenium alone offers bindings in several languages, including Java, C#, Python and JavaScript.
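
To give a feel for the "read and edit configuration files" baseline Montvelisky describes, here is a minimal sketch using Python's standard configparser module. The file contents, section and key names are all invented for illustration.

```python
import configparser

# Hypothetical app configuration that a tester might need to repoint
# at a test environment before a run.
raw = """
[database]
host = prod-db.example.com
port = 5432
"""

cfg = configparser.ConfigParser()
cfg.read_string(raw)

# Redirect the product at the test database rather than production.
cfg.set("database", "host", "test-db.example.com")

print(cfg.get("database", "host"))
print(cfg.getint("database", "port"))
```

Trivial as it looks, even this presumes familiarity with INI-style syntax, sections and typed lookups, and each platform has its own equivalent (YAML, JSON, XML, registry keys and so on), which is rather the point about the size of the curriculum.
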

The above would be difficult to achieve even for recent CS and software engineering graduates. The testing community cannot even reach a consensus on what is "testing" and what is "checking", so could it agree on a basic programming requirement? I doubt it, and the attempt would cause uncertainty and confusion for all of us. Groups like the ISTQB and IEEE would be tempted to use the vacuum to impose a minimum test-programming standard (and even provide certification in it), which others such as the Context-Driven and Rapid Software Testing communities would fight hard to resist - creating another great schism in the testing community.

We could let individual recruiters and teams decide (as is done in development), but since it is impossible to learn all of the above to a professional level, new testers would have to specialise before even getting their first jobs, a decision that closes off large areas of the job market. Established testers who have no interest in programming whatsoever would have to spend enormous time and resources to upskill, specialise and learn tools they do not care for just to be considered "adequate", despite their skills in other areas and otherwise stellar achievements to date. Is that fair to them?

Mandating that testers must have some minimum programming knowledge therefore opens up a minefield of questions and concerns that the testing community will struggle to agree on. This could well lead to point 4 below.

4) We Encourage Unhelpful and even Lazy Recruiting Practices


As someone who has done commercial development in the past, is studying computing at postgraduate level and takes a great interest in programming (although I am happily committed to being a tester), I spend some time reading the various IT and development industry forums. On the subject of recruitment, I have come across various complaints -

  • Employers and recruitment consultants who "demand the world", requiring an unreasonably long list of programming languages and frameworks and rejecting applicants whose skills are slightly different but quickly transferable.
  • Job ads for "entry-level" developer roles requiring CS degrees (even for relatively simple programming tasks that non-graduates could do) and years of commercial experience - requirements suspected of existing simply to cut down the number of applications.
  • Job ads requiring years of commercial experience in the latest and trendiest tools of the day, which hurts the chances of those outside new startups and innovative projects: developers working in BAU and corporate environments with long-established tools, and older programmers.

Without a consensus on Point 3, I suspect that some of the lazier and more unhelpful recruitment practices mentioned above would flood into testing recruitment. I have already seen ads for experienced testers requiring a CS degree as a minimum, which disregards the skills, achievements and experience of those from other backgrounds who could still do the job.

Final Points


Dorothy Graham, in her 2014 blog article "Testers Should Learn to Code?", argues stridently that a mandatory coding requirement is a dangerous attitude and lists various thought-provoking reasons, some of which overlap with the above. A selection -

  • Test managers can use it as a justification to get rid of good and productive testers.
  • Not all testers will ever be good at, or interested in, programming.
  • Testing skills become devalued relative to coding skills.
  • Tester-developers will either choose to become, or be pushed into becoming, full-time developers, and the testing profession loses them.

I agree with her opinions, and I hope that this blog article and those linked to above continue to be a useful part of the testing-and-coding debate. An expectation that all testers must be able to code may not help us, and may in fact cause needless chagrin in our profession.

Monday 7 August 2017

On the Need to Test for Beauty and Elegance

I have spent the last month off work, visiting parts of the UK, Paris and Seoul with my wife. During this time I have spent many hours in museums and at historic sights and, strangely enough, pondering my half-formed (and admittedly likely flawed) ideas about what beauty and elegance are.

Which IT products would end up in a museum or art gallery in 100 years' time? Computers that didn't look great but represented an impressive landmark in computer science, such as Tom Kilburn's Baby? Beautiful and stylish Apple iMacs and iPads? More "functional" but equally noteworthy VIC-20s and Atari STs? What about software? Computer games such as the Tomb Raider and Destiny series are beautiful to look at and great fun to play, but the much more mundane Microsoft Office has been more important to people across a wide range of areas. Can one genre be seen as "art" and elevated to the Louvre one day, sharing a wing with the Botticellis and the Venus de Milo, while the other is deemed more fitting for industrial museums such as the venerable Museum of Science and Industry in Manchester, sharing a wing with the steam engines and 19th-century cotton looms? Is this a poor distinction to make, and should all software be treated equally?

Sandro Botticelli: Madonna and Child with St. John the Baptist, c. 1470–1475, Louvre (CC: Wikipedia)
Replica of Tom Kilburn's Manchester Small Scale Experimental Machine, nicknamed "Baby". The world's first stored program computer. (CC: Wikipedia)



How many development teams see their end product as aesthetically pleasing, and write this into their requirements? Who is the best judge of something so abstract and intangible - a UI expert, key users, the testers or the manager? In the middle and latter stages of a project, when the deadline is upon us and the resource constraints come thick and fast, how much aesthetic quality is sacrificed to reach the goal? How often do testers actually raise an issue that a design or its implementation is ugly or inelegant?

I have worked with some (what I regarded as) damn ugly and inelegant software on past projects, yet I admit that I have rarely put up my hand and stated that a feature was just not stylish or pretty enough (though I have improved on this recently). In my experience, the all-too-common wisdom in teams has been that it was "what was agreed in the spec", "what the client wants" or "what the designers/devs came up with, and it is too late to change it". In terms of aesthetics, I have simply tended to focus on usability and how closely the user interface matches the design. In this respect I should have done better and fought against the perceived wisdom, which in some cases was nothing more than an excuse for inaction.

However, usability is not the same as elegance, and "matching the design" is not the same as beauty. All software intended for use by people will be judged on its aesthetic qualities and elegance. We want software that elates the heart as well as satisfies the need, and testers have a critical role to play in achieving this.

The focus (especially in agile development) on inclusion of features above all else - done in a rushed fashion without care for aesthetics - detracts from the elegance (of which simplicity is a large part) and beauty of the eventual product. Even after production deployment, we create environments in which the final product faces new challenges - oversized and dynamic ad content that changes the layout of web pages, or overly complicated and badly designed late-addition game levels and characters - that give a poor impression to our users.

This can also be caused by testers being brought in at the development (or later) stages of the project, after requirements and design are "locked down"; the current trend towards "shift-left" may resolve this. Another cause, on certain projects, is a heavy focus on meeting the letter of the requirements over thinking about what is best for the users - something that takes leadership from the stakeholders and project management down to resolve, but for which testers must be willing to ask the difficult questions.

As for who is best placed to judge what is beautiful or elegant - or what these terms even mean in IT - I have no definitive answer and put the question out to the community. There are web and UI design standards that can help, but they do not solve the problem. At one level, product walkthroughs and the inclusion of management and user views at every iteration can at least provide opinion and consensus (where all voices carry equal weight), although even then the focus can tend towards what is functionally sufficient rather than the aesthetics.

I implore all of us working in development teams to challenge views about design and look-and-feel as early as possible in the lifecycle. Testers should ask hard questions about the aesthetic qualities of the product, but without acceptance from stakeholders, management, business analysts, designers and developers, nothing will change. There are clearly many examples of stunning and elegant products that people love as well as use (Apple being a case in point), so good practice does exist. We all need to find and follow it.