When do people report bugs? You probably never asked yourself that question, which was one of the reasons we wanted to answer it. The second reason was that we had the right type of data available, since our tool helps teams report and fix bugs. Long story short, we decided to learn more about the way software development teams approach the bug reporting process. For this exercise, we took a random sample of 3000 bug recordings uploaded to our database in the last 2 months. Here's what we found.
Turns out that 2pm is the golden hour for reporting bugs. Activity grows through the morning up to 9am, and then again towards the afternoon, peaking at 2pm. There is also a local spike at 5pm. It's hard to know for sure why that is. A possible explanation is that some people do a round of testing before the end of their work day. That brings us to the next point - working hours.
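The hourly distribution behind this chart boils down to binning upload timestamps by hour of day. A minimal sketch, assuming each recording carries an ISO 8601 upload timestamp (the sample values here are made up for illustration):

```python
from collections import Counter
from datetime import datetime

# Hypothetical sample of upload timestamps (ISO 8601 strings).
uploads = [
    "2019-10-07T09:14:00", "2019-10-07T14:02:00",
    "2019-10-07T14:45:00", "2019-10-07T17:10:00",
]

# Count recordings per hour of day (0-23).
by_hour = Counter(datetime.fromisoformat(t).hour for t in uploads)

# The busiest bucket; in this toy sample it is 14, i.e. 2pm.
peak_hour, peak_count = by_hour.most_common(1)[0]
print(peak_hour)  # 14
```

The same `Counter` can then feed straight into whatever plotting library you prefer.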
Is there even such a thing as a standard working day in software development circles? Turns out - not really. We compared how many recordings are uploaded within the regular 9-to-5 as opposed to other times.
The split is almost equal, which was at first a bit surprising to us. When you look at the previous graph (bugs reported by hour), there seems to be more activity during the normal working day. You therefore don't expect an almost perfect 50/50 distribution. The explanation is quite straightforward though - 9-to-5 is only a third of the day. The remaining period is twice as long in hours, but bugs are reported there at half the rate. That raises a question: are the individuals in product teams more flexible these days? Or are we just working longer and should reconsider our habits?
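The back-of-the-envelope arithmetic is easy to verify: the 9-to-5 window covers 8 of 24 hours, so a 50/50 split implies the off-hours reporting rate is half the in-hours rate. A quick check (the rates here are arbitrary units, not measured figures):

```python
in_hours = 8            # the 9am-5pm window, in hours
off_hours = 24 - 8      # the remaining 16 hours of the day

rate_in = 1.0           # bugs per hour during the working day (arbitrary unit)
rate_off = rate_in / 2  # half the in-hours rate, as suggested above

bugs_in = in_hours * rate_in     # 8.0
bugs_off = off_hours * rate_off  # 8.0
print(bugs_in == bugs_off)       # True: an even 50/50 split
```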
We expected people to file fewer bug reports on Fridays and weekends than the rest of the week. This assumption proved to be correct. What we didn't expect is that the first two days of the week would see higher activity. Our assumption was that there would be little to no difference between Monday, Tuesday, Wednesday and Thursday.
According to a 2009 study by Capers Jones, 85% of bugs are caught before the code gets to production. Is that the case today?
In our particular sample, about 2/3 of bugs are reported on pre-production environments (local and staging). That number is lower than the figure in Capers Jones's original research.
One plausible reason can be a different user group. While Capers Jones seems to have focused primarily on software development teams, our product is used not only by engineers, QA, PMs and designers, but also by business roles - we see an increasing amount of activity from customer support teams, for example. That means two things. First of all, business and operations roles in general don't use staging, so they can only report bugs on production. Secondly, certain things they might report as bugs aren't actually bugs. Even a person who knows the product inside-out can't immediately tell if something's indeed a bug every single time. Some behaviours can be caused by hard-to-find settings, ad blockers or connection issues.
We can't say for sure if this assumption correctly explains the roughly 20-percentage-point difference in the share of bugs reported on pre-production environments. Many factors are at play: company size, development methodology (agile/waterfall), quality assurance process, complexity of business logic and even what a particular company defines as a bug.
For that exercise we used the titles of bug reports to create a word cloud and see the most common themes.
The result turned out to be rather predictable. Words like Bug, Error, Page, Test were the most common. To be frank, we did not expect "Collection" and "Date" to be that frequent - could be due to the sample, though.
As the next step, we removed the seven highest-ranking words to understand the rest of the group better. Now it's more apparent that adjectives and verbs are also on the list. Despite that, the majority of the words are nouns.
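Both steps - counting word frequencies across titles and then dropping the highest-ranking entries - fit in a few lines of standard-library Python. A minimal sketch, with made-up sample titles (the real analysis removed the top seven words; this toy sample only has enough variety for the top two):

```python
from collections import Counter
import re

# Hypothetical bug report titles.
titles = [
    "Bug: error on test page",
    "Collection date bug",
    "Error loading collection page",
]

# Tokenize into lowercase words and count occurrences.
words = Counter(
    w for t in titles for w in re.findall(r"[a-z]+", t.lower())
)

# Drop the N most common words to inspect the rest, as described above.
top = {w for w, _ in words.most_common(2)}
rest = Counter({w: c for w, c in words.items() if w not in top})
print(rest.most_common(3))
```

In practice you would also strip stop words ("on", "the" and so on) before counting, otherwise they dominate the cloud.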
What we found most interesting, and in a very positive way, is which words were NOT at the top of the list. From personal experience we know that sometimes people describe bugs in very general terms, e.g. "Nothing works", "X is broken". To help companies avoid that, we have even created a bug report template to help especially non-technical people communicate bugs better. Yet in our sample, happily, that behaviour was uncommon.
Despite having only 17% of the overall market, macOS is the most popular choice among bug reporters. More bug reports were made using this operating system than Windows and Linux combined. Could it be because macOS makes you more productive?
The majority of usage in the last 2 months came from the previous version of macOS (Mojave). Lately though we see more activity from Catalina, as you might expect.
The data we have on Windows is a lot less granular. We can only say that almost all recordings were uploaded using Windows 10.
At the moment our product is officially supported only on Chrome, which is why we decided to exclude information about other browsers. We can however say that Chrome accounts for about 80% of our website traffic, followed by Firefox and Safari.
The earliest bug reports in our dataset are 2 months old. At the same time, very few of them came from Chrome versions released over 4 months ago. It seems that Google is fairly good at making sure that users upgrade to newer versions.
We can only recommend this strategy as a best practice - it makes things much easier to debug. It also ensures a more homogeneous user base, which has positive implications not only for software development teams, but also for technical support.
To conclude, we decided to take the data we had and create a bug report you are most likely to see. To do that, we combined the information from this article with the figures that didn't make it into the graphs above.
And here it is - the most "average" bug report of this fall.
Title: Bug error. Test collection date issue.