Beware: not all traffic is created equal. If you’re a website owner running advertising and SEO campaigns, the chances are that traffic is important to you. An effective campaign can lead to a healthy increase in visitors, but watch out - statistically speaking, the majority of visitors that land on your pages will not only be uninterested in your product but won’t actually be human at all.
Alas, the internet is crammed full of invisible bots that have a habit of turning up uninvited on websites. Whether you own a local business or a blog, it’s likely that you’ve been visited by a robot without even realising it - and have probably already been paid a visit today.
So, what do these bots actually do? And how do they affect the overall quality of traffic that your campaign receives? Let’s take a deeper dive into the varied world of traffic quality, and look at how to measure the type of traffic you receive.
(Some bots may be good, some may be bad - but they’re marginally more common website visitors than humans. Image: Imperva)
Several studies have indicated that over half of internet traffic today consists of little robots performing automated and repetitive tasks for their developers.
For example, SEMrushBot is one of the good bots that will pay you a visit when you run your website through a site audit tool.
Some of these bots are completely harmless when they navigate onto your website, while others can have harmful and sometimes devastating ramifications.
Of course, this is the last thing you’ll need to hear if you’ve already set up an advertising campaign intent on drawing in new customers. It’s imperative that those who access your pages are doing so with the intent of making a purchase or at least promoting your brand.
Bot activity is rife online today, and it’s vital that users know when their advertising channels are drawing in genuine conversions as opposed to a network of passive bots. To better understand how to measure traffic quality, let’s take a look at the types of visitor bots that your website is likely to come across.
While the notion of attracting bots as visitors doesn’t sound like the most useful form of website traffic, they can actually hold plenty of benefits to businesses and individual site owners alike. When somebody runs a search that’s related to your products or services, whether or not your website appears within the list of relevant results is down to the work of so-called ‘good bots’.
Good bots account for around 22.9% of the typical visitors that a website receives, and commercial crawlers - which come in the form of the reputable Googlebot, Bingbot and Baidu Spider, for example - make up 2.9% of the average website’s traffic. Good bots generally respect the regulations imposed on their crawling activity and indexing rates, which are typically defined in a website’s robots.txt file. Some crawlers can be prevented from indexing websites if they’re determined to be irrelevant to the business in question.
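As a minimal illustration of such a robots.txt file - the bot names, paths and rules here are examples, not recommendations:

```
# Allow Googlebot everywhere except a private area
User-agent: Googlebot
Disallow: /admin/

# Block a crawler deemed irrelevant to the business (hypothetical name)
User-agent: ExampleIrrelevantBot
Disallow: /

# Ask all other crawlers to slow their request rate
# (Crawl-delay is non-standard and honoured by some crawlers only)
User-agent: *
Crawl-delay: 10
```

Well-behaved crawlers fetch this file from the site root before indexing; bad bots, as we’ll see, simply ignore it.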
Besides search engine crawlers, good bots can also be found in partner bots, like Slackbot, social bots like Facebook Bot, website monitoring bots like Pingdom, backlink checker bots like SEMRushBot and aggregator bots like Feedly.
It’s worth noting that even so-called good bots can pose a few problems at times, and increased bot activity can conspire against websites with limited server capacity. Of course, a significant problem for websites running advertising campaigns is that this form of traffic imitation can skew the performance of their strategies.
Bad bots are the scourge of the internet. They account for around 28.9% of website traffic online and are designed to negatively impact your website. These types of bots operate in a more evasive style and are typically used by fraudsters, hackers and other cybercriminals looking to perform illegal activities online. Some bad bots can be sent by third-party scrapers or online competitors to steal your website’s content.
Other bad bots can make thousands of page visits in such a short space of time that they crash your servers, or slow your site down for genuine users. Bad bots can scrape your content and publish it somewhere else on the web - negatively impacting your search engine ranking positions. Sometimes, stolen content can even outrank your original articles or news stories within Google. This can affect the bottom line of website owners who have either worked hard to create new and original content or paid professionals to craft pieces for their site.
Other bad bots can spam website comments sections and forums, as well as overload classified advertisements with hundreds of thousands of fabricated leads. Furthermore, bad bots can carry out cart abandonment on eCommerce sites, negatively affect web analytics, impersonate accounts to steal banking information and generally undermine your website’s security.
Bad bots are on the rise, and this is down in no small part to the prevalence of impersonator bots. Accounting for 24.3% of all visitors that the average website receives, impersonators mimic human characteristics in a way that can outwit a fair share of security setups, allowing them to carry out malicious activities on a given website. Bad bots are also becoming an increasingly popular tool during election campaigning to trick social media users into believing certain sets of ideologies.
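Because impersonators spoof browser signatures, screening for bots is hard in general - but a simple first line of defence is to flag visitors that openly identify themselves as bots in their user-agent string. A minimal Python sketch (the marker list is illustrative, and this will not catch impersonators that masquerade as browsers):

```python
# Naive user-agent screening: a sketch only. Impersonator bots spoof
# browser user-agents, so this catches only bots that declare themselves.
KNOWN_BOT_MARKERS = ("bot", "crawler", "spider", "slurp")

def looks_like_declared_bot(user_agent: str) -> bool:
    """Return True if the user-agent string openly declares a bot."""
    ua = user_agent.lower()
    return any(marker in ua for marker in KNOWN_BOT_MARKERS)

print(looks_like_declared_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(looks_like_declared_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```

Filtering sessions this way lets you report human and declared-bot traffic separately, which is exactly the distinction your campaign metrics need.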
(Dark social shares can be similarly misleading when it comes to charting the quality of traffic you receive. Image: Talkwalker)
If the threat of bad bots undermining your advertising metrics wasn’t enough, there’s also the fearsome presence of ‘dark social’ to deal with. Dark social refers to the invisible shares that occur through platforms like Messenger, SMS and WhatsApp.
If you see an advert for a product you like online and copy the URL to paste in a WhatsApp conversation, this counts as a dark social share. It will appear as a ‘direct’ share but the website in question won’t know where you got the advertisement from - leaving it open to being a form of referral traffic that’s attributed to the incorrect channel.
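One common way to recover attribution from dark social shares is to tag the links you distribute with UTM parameters, so the channel survives a copy-paste. A sketch using Python’s standard library (the parameter values here are illustrative):

```python
from urllib.parse import urlencode, urlparse, urlunparse

def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters so copy-pasted links keep their attribution."""
    parts = urlparse(url)
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string already on the URL
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

print(tag_url("https://example.com/product", "whatsapp", "social", "spring_sale"))
# https://example.com/product?utm_source=whatsapp&utm_medium=social&utm_campaign=spring_sale
```

A visit via a tagged link is then reported under the declared source and medium rather than lumped into ‘direct’ traffic.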
According to Talkwalker, the term Dark Social was first used by Alexis Madrigal of The Atlantic, who claimed that we’re only seeing and measuring the tip of the sharing iceberg. The problem with Dark Social is that it skews our interpretations of the success of our marketing campaigns.
The prevalence of Dark Social shares is huge. In fact, 84% of shares occur in this invisible manner. While all shares can be interpreted as a good thing, it’s vital that you have a clear insight into what’s working and what isn’t when it comes to marketing campaigns.
The task of ensuring that your products and services reach the right audiences at the right time is worth a lot of money. If one particular campaign is performing better than another, but the metrics on hand can’t see it, then you’re essentially throwing money down the drain by persevering with an ineffective approach to driving conversions.
Finding ways of measuring traffic quality accurately is not only useful but highly valuable too. Luckily, there are plenty of high-quality web analytics and security services out there that promise to provide accurate insights into the quality of your traffic to determine whether or not you’re in a position to drive human conversions or are simply opening yourself up to bots.
(Chart from Finteza showing high-quality and low-quality traffic arrivals on a website. Image: Finteza)
All things considered, there are a number of signals that can help you measure the quality of your website traffic:
Check where most of your traffic is coming from. Is it your target country? If you’re running a local business, receiving most of your traffic from the other side of the world might not be the thing you wished for.
The simplest way to check the location of your users is to use Google Analytics: go to Audience > Geo > Location.
Keep track of visitors that drive conversions (e.g. email signups, purchases and form submissions). Check the country they’re coming from and what pages drive these conversions.
Setting up Google Analytics conversion goals will help you track that.
Check how many people leave without viewing other pages on your website.
Seeing a downward bounce rate trend is a good sign.
Be sure to check how many pages visitors view in a single visit, along with the time they spend on the site.
By filtering and combining the four metrics above, you’ll be able to identify the visitors that come under the “good traffic” category.
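As a sketch of how these four signals might be combined, here is a small Python example over exported session records - the field names and figures are invented for illustration, not taken from any real analytics API:

```python
# Hypothetical exported session records: one dict per visit.
sessions = [
    {"country": "US", "pages": 1, "seconds": 5,   "converted": False},
    {"country": "US", "pages": 4, "seconds": 180, "converted": True},
    {"country": "CN", "pages": 1, "seconds": 2,   "converted": False},
    {"country": "US", "pages": 3, "seconds": 95,  "converted": False},
]

total = len(sessions)

# Signal 3: a bounce is a single-page visit
bounce_rate = sum(s["pages"] == 1 for s in sessions) / total
# Signal 4: engagement depth and duration
pages_per_session = sum(s["pages"] for s in sessions) / total
avg_seconds = sum(s["seconds"] for s in sessions) / total
# Signals 1 and 2: where conversions actually come from
conversions_by_country = {}
for s in sessions:
    if s["converted"]:
        country = s["country"]
        conversions_by_country[country] = conversions_by_country.get(country, 0) + 1

print(f"Bounce rate: {bounce_rate:.0%}")           # 50%
print(f"Pages/session: {pages_per_session:.2f}")   # 2.25
print(f"Avg time on site: {avg_seconds:.1f}s")     # 70.5s
print(f"Conversions by country: {conversions_by_country}")
```

Tracking these figures over time - rather than raw visit counts - is what reveals whether a campaign is attracting genuine prospects or padding your numbers with bots.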
Advertising campaigns work in different ways, and different approaches can produce very different results. There are plenty of analytical tools out there that will help you to find out whether or not the advertising services you’re paying for are delivering value for money.
If you find that your big marketing campaign is filled with impersonator bots, it could be time to pull the plug and find another avenue to drive conversions.