How do you select a tool in an ecosystem that generates frameworks faster than a Zerg swarm, and kills them off with the enthusiasm of Darwin on amphetamines?
I asked people in the survey to rate their experience with the tools. Some questions looked at process benefits, such as the confidence to release frequently and the ability to prevent bugs. Other questions looked into tool usability: how easily developers can work with their tools, and how easily they can maintain and understand test cases. The combined scores for those two groups of questions give a good indication of how happy teams are with their chosen tools. Of course, the process benefits depend more on the contents of the test cases than on the tools themselves, but a big difference in the benefits rating could indicate that a tool is unsuitable for some key workflow.
From that viewpoint, a good tool choice would be one that is popular enough not to be abandoned soon, and that users rate highly on both process benefits and usability.
Before we get into the numbers, here’s a quick note on data accuracy. Based on the 683 responses, and the latest estimate for the number of developers worldwide, the margin of error for the survey results is 4% with a 95% confidence level. Also, note that the benefits and usability questions were formulated so that people rated their entire setup, not an individual tool, which is important where a combination of tools comes into play.
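The margin of error quoted above can be reproduced with the standard formula for a survey proportion at 95% confidence (z ≈ 1.96), using the conservative worst-case assumption p = 0.5. A quick sketch:

```javascript
// Margin of error for a survey proportion.
// p = 0.5 is the worst case (it maximises the error),
// z = 1.96 corresponds to a 95% confidence level.
function marginOfError(sampleSize, p = 0.5, z = 1.96) {
  return z * Math.sqrt((p * (1 - p)) / sampleSize);
}

const moe = marginOfError(683);
console.log((moe * 100).toFixed(1) + '%'); // prints "3.7%", which rounds to the 4% quoted above
```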
The original research also contained questions on related tools such as browser runners and continuous integration platforms. If you’d like to dig deeper into the data or see the answers to other questions, download the full results.
Looking at popularity alone, three tools clearly stand out. The first two will be no surprise to anyone who has been involved in the community for more than a few years, but the bronze medal was quite a surprise to me:
Top 5 most popular test automation tools
Here is how the users rated their experiences with the top three tools:
Mocha wins slightly in terms of positive effects on the development process, but the other two follow closely, meaning there's likely no major issue with any of the top three for key modern workflows. Jest leads in developer usability, by an insignificant margin over Mocha but a significant one over Jasmine. All the remaining tools are significantly less popular.
Removing responses where teams used more than one tool, the rankings stay the same:
The recent popularity of Jest is probably caused by the meteoric rise of React, as they both come from the same source. In the survey, about 80% of people using Jest also use React.
Front-end frameworks used by teams in the survey
Here’s how the numbers change when focused on a few popular application types:
Roughly 43% of the respondents use React in their work. Jest and Mocha seem to be similarly popular for this use case, but Jest leads in developer usability for testing React applications.
Roughly 34% of the respondents use some version of Angular. In the Angular application development world, Jest drops off the radar and another tool pops up in third place: Cucumber JS. Jasmine is by far the most popular choice. People using Cucumber seem to rate their test setup higher, but the overall differences here are inconclusive, so I'd need more data before making any big recommendations.
Roughly 20% of the respondents work only on back-end applications. Unfortunately, the numbers here become a bit less trustworthy, but Mocha seems to be significantly more popular than the rest, and better rated. I would, however, like to collect more responses for this case before drawing any conclusions. Maybe you can help?
As I mentioned before, the numbers add up to more than 100% because many teams used more than one tool. In fact, only about half of the teams used a single tool.
Number of test automation frameworks used by teams in the survey
Looking at the popular combinations, I was quite surprised with the results!
The most popular combination seems to be Jasmine and Mocha, which is very curious, as they both tend to solve the same problem. I can only assume this happens because people adopted one tool early on and later switched to another, so different parts of the codebase are covered by different tools. Another possibility, based on the earlier observations about back-end testing and Angular, is that people use Mocha for back-end code and Jasmine for front-end code. However, that's not the really interesting part of the tool combination table.
Check out the pattern emerging when ordering by benefits:
Top tool combinations, ordered by process benefits as voted by teams in the survey
Although Cucumber JS on its own isn’t popular enough to get to the top 3 choices on the list, combining Cucumber with one of the more popular tools seems to make a big difference for bug prevention and deployment confidence. Cucumber and Mocha together scored 74% on the benefits scale, higher than any other tool in isolation, or any other combination.
A similar pattern emerges when ordering by usability:
Top tool combinations, ordered by developer usability as voted by teams in the survey
Cucumber and Jest as a combination seem to win by a huge margin in terms of making developers happy about maintaining, writing and understanding automated tests. With a 70% score, this is higher than any other tool in isolation, or any other combination.
We don’t use React so I’ve not really paid attention to Jest before, but the results of this survey will make me think twice about the tool choice for the next project.
Jest seems to win narrowly in terms of developer happiness. Although it is a relatively new kid on the block, its corporate backing from Facebook makes it likely to stay around for a long time. It also seems to be crossing over from React into a more general-purpose tool, as confirmed by the 20% of its users who do not use React.
The other interesting conclusion seems to be that combining Cucumber with any of the popular unit-testing tools increases the overall rating, quite significantly. I assume this is because Cucumber tends to solve a different problem from Jasmine, Mocha or Jest. The three most popular tools are aimed at developers, whereas Cucumber is geared more towards cross-functional work and looking at quality from a slightly higher level. Extracting higher-level test cases from a developer-oriented tool likely makes both parts cleaner and easier to work with. Although most people (wrongly) equate Cucumber with browser-level user interface testing, it's a decent tool for extracting business-readable scenarios that can be validated by domain experts, so it's a valid choice for back-end work as well.
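To make that distinction concrete, here is a minimal sketch of the kind of business-readable scenario Cucumber works with. The feature wording and domain are hypothetical, invented purely for illustration, not taken from the survey:

```gherkin
# A Gherkin feature file: domain experts can review this wording directly,
# while developers wire each step to code underneath.
Feature: Refund processing
  Scenario: Refund within the return window
    Given a customer bought a jacket 10 days ago
    And the return window is 30 days
    When the customer requests a refund
    Then the refund is approved
```

The same behaviour could be asserted directly in Mocha or Jest, but pulling it out into a scenario like this keeps the developer-level tests focused on implementation detail, which may help explain the higher combined ratings.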
My conclusion from this is that the combination of Jest and Cucumber seems to strike the best balance between process benefits and developer usability overall. If you’re starting a new project with a clean slate, that seems to be a good choice for the test framework, closely followed by Mocha and Cucumber.