Analysing Analytics Solutions — Part One of the Series
While nearly all of the following analytics solutions work by adding tracking code to each of your pages, they offer very different insights into your data and varying levels of privacy for your audience; not all of them give you the option to filter out bot traffic manually, for example.
The analytics solutions we will be looking at in the series are:
- Fathom Analytics
- Plausible
- StatCounter
- Google Analytics
Selected Analytics Providers
Fathom Analytics provides website statistics without tracking or storing users’ personal data. The software monitors your website’s performance and surfaces trends and insights from the data it collects. It is privacy-focused and compliant with both the GDPR and the ePrivacy Directive. The tool measures audience activity and provides the data you need to spot outliers in visitor behaviour, and you can filter that data to identify effective content strategies or new opportunities.
The tool also gives you a detailed overview of your audience’s behaviour as they interact with the website, and you can build custom reports tailored to your department’s specific needs. Fathom Analytics is easy to use and highly customizable. For developers, it allows you to block your own traffic, including “localhost” activity while you’re coding. However, like Google Analytics, it filters out bot traffic and therefore does not give you the full picture.
Plausible is an open-source website analytics tool that presents its insights on a single page. It is available as a hosted or self-hosted solution. It is a popular choice for privacy-conscious site owners because all site measurement is carried out anonymously: Plausible does not use cookies, collects no personal data, uses no persistent identifiers, and does no cross-site or cross-device tracking. Your site data is not used for any purpose other than your own analytics.
Plausible is very simple to use: it provides an at-a-glance view of the most important metrics on one page. Its script is 45 times smaller than the Google Analytics script, which cuts down on page weight, leading to faster loading times as well as a smaller carbon footprint for your site. Plausible lets you identify trends by setting up custom events or page URLs to track. You can keep on top of your traffic with automated email reports, and you can share your stats privately by generating a shared link. Plausible filters out bot traffic automatically, so website managers have no visibility of it.
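For illustration, a minimal sketch of how a custom event might be recorded once the Plausible script is loaded on a page is shown below; the event name, the property, and the global type declaration are assumptions made for the example rather than details taken from the article.

```typescript
// Minimal sketch, assuming the Plausible script (with its custom-events
// support) is already included on the page and exposes a global `plausible`
// function. The event name and property are illustrative only.
declare global {
  interface Window {
    plausible?: (event: string, options?: { props?: Record<string, string | number> }) => void;
  }
}
export {};

// Record a custom event, e.g. when a newsletter signup form is submitted.
window.plausible?.("Signup", { props: { plan: "trial" } });
```

An event like this typically also needs to be registered in the Plausible dashboard before it appears in reports, so treat the snippet as a sketch of the pattern rather than a complete setup.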
StatCounter offers a simple interface for basic site analytics and reports on web page views, sessions, site visitors, and new visitors. This basic dashboard is free. StatCounter also provides advanced paid features, such as reports for bounce rate, conversion rate, and paid traffic.
In addition to helping users analyse traffic trends over time, StatCounter monitors how paid traffic is performing on ad networks such as Google, Facebook and Twitter. The platform helps you detect click fraud and see how much of your budget it is wasting, and it notifies you when an important visitor returns to the site. With this tool, you can follow the entire user journey and identify possible issues with navigation, site structure, and flow.
Google Analytics is the most popular web analytics solution, currently holding over 85% market share according to W3Techs. Google Analytics is free, easy to configure and flexible. It provides a wealth of information: not only the number of page views and visits, but also insight into your site’s visitors. Google Analytics allows you to determine who is visiting your site, where they come from, and how often they have visited in the past.
Google Analytics has a standard, built-in feature called ‘Bot Filtering’. It can be found under ‘View Settings’ in the Google Analytics admin panel, where you can tick the box ‘Exclude all hits from known bots and spiders’. Once this filter is enabled, hits from known bots are simply discarded rather than reported separately, and this lack of visibility of bot traffic means that you are not getting the full picture of your analytics data.
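As a rough illustration of why such filtering removes visibility, the sketch below shows the general idea of a known-bots filter: hits whose user agent matches a deny list are dropped before they ever reach the reports. The token list, types, and function names are invented for the example and are not Google Analytics’ actual implementation.

```typescript
// Illustrative sketch only (not Google Analytics' actual implementation):
// when a "known bots" filter is enabled, matching hits are dropped before
// aggregation, so they never appear anywhere in the reports.
const KNOWN_BOT_TOKENS = ["googlebot", "bingbot", "ahrefsbot", "semrushbot"]; // hypothetical sample list

interface Hit {
  userAgent: string;
  page: string;
}

function isKnownBot(hit: Hit): boolean {
  const ua = hit.userAgent.toLowerCase();
  return KNOWN_BOT_TOKENS.some((token) => ua.includes(token));
}

// Hits from known bots are discarded rather than reported separately,
// which is why the bot share of traffic stays invisible to the site owner.
function reportableHits(hits: Hit[]): Hit[] {
  return hits.filter((hit) => !isKnownBot(hit));
}
```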
Where does our DeviceAtlas Solution fit in?
In addition to our accurate device recognition, a key benefit of DeviceAtlas is our bot identification capability. We have our own content serving network that incorporates methods to separate human visitors from robots. Good bots self-identify via their user agent string, and in these cases DeviceAtlas is able to provide the bot name. Our bot identification analytics help customers answer questions about their traffic by analysing the ever-increasing number of events (HTTP requests, Workers requests, Spectrum events) that we log every day.
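To make the idea of self-identifying bots concrete, here is a hedged sketch of reading a bot name straight out of the user agent string; the patterns, bot names, and helper function are illustrative samples, not the DeviceAtlas implementation or API.

```typescript
// Hedged illustration of the general idea (not the DeviceAtlas API itself):
// well-behaved bots self-identify in their user agent string, so a name can
// often be read directly out of the UA. Patterns and names here are samples.
const SELF_IDENTIFYING_BOTS: Array<{ name: string; pattern: RegExp }> = [
  { name: "Googlebot", pattern: /Googlebot\/[\d.]+/ },
  { name: "Bingbot", pattern: /bingbot\/[\d.]+/i },
  { name: "DuckDuckBot", pattern: /DuckDuckBot/ },
];

function botNameFromUserAgent(userAgent: string): string | null {
  const match = SELF_IDENTIFYING_BOTS.find(({ pattern }) => pattern.test(userAgent));
  return match ? match.name : null;
}

// A genuine Googlebot UA contains "Googlebot/2.1", so the lookup returns
// "Googlebot"; an ordinary desktop browser UA returns null.
botNameFromUserAgent(
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
); // -> "Googlebot"
```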
No other device intelligence solution on the market hosts websites itself, so competitors see only the HTTP headers from their traffic sources, without the context that comes from visibility of the visitor’s activity. As a result, they are unable to distinguish between bot and human visitors through direct measurement. In addition, the DeviceAtlas API algorithm examines every character in the UA string rather than just looking for tokens or patterns. This makes it sensitive to very minor changes in the UA string, such as spacing, single-character or casing changes, and allows such traffic to be identified separately as bot traffic where applicable.
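The difference between looking for tokens and examining the whole string can be sketched as follows; the UA strings, the set of “known genuine” UAs, and the helper names are assumptions made for the example and do not represent the actual DeviceAtlas algorithm.

```typescript
// Hedged sketch of token matching versus character-exact matching.
const GENUINE_CHROME_UA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";

// A token-based check only asks "does the string contain 'Chrome'?"
function looksLikeChromeByToken(ua: string): boolean {
  return ua.includes("Chrome");
}

// A character-exact check compares the whole string against known genuine UAs,
// so an extra space or a casing change no longer matches.
const KNOWN_GENUINE_UAS = new Set([GENUINE_CHROME_UA]);

function matchesKnownGenuineUa(ua: string): boolean {
  return KNOWN_GENUINE_UAS.has(ua);
}

// A bot copying the Chrome UA but introducing one extra space still passes the
// token check, while the exact comparison notices the change.
const spoofedUa = GENUINE_CHROME_UA.replace("Win64; x64", "Win64;  x64");
looksLikeChromeByToken(spoofedUa); // true  - the token approach is fooled
matchesKnownGenuineUa(spoofedUa);  // false - the exact approach flags the mismatch
```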
DeviceAtlas is a member of the Interactive Advertising Bureau (IAB) and subscribes to the IAB Spiders and Bots list, against which it is regularly tested to ensure completeness and accuracy. The IAB list contains accept-list and deny-list token information, but the DeviceAtlas approach is more robust than this, thanks to the parsing approach described above, and is able to identify bots that the IAB approach cannot.
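To give a sense of what accept-list and deny-list tokens do, here is a simplified sketch in the spirit of such lists; the tokens and user agent strings are made-up samples, and real IAB entries carry more structure than plain substrings.

```typescript
// Simplified sketch of accept-list / deny-list token matching; the tokens
// below are illustrative samples, not entries from the real IAB list.
const DENY_TOKENS = ["bot", "crawler", "spider"]; // treat as bot if present...
const ACCEPT_TOKENS = ["cubot"];                  // ...unless an exception token also matches

function isBotByTokenLists(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  const denied = DENY_TOKENS.some((token) => ua.includes(token));
  const excepted = ACCEPT_TOKENS.some((token) => ua.includes(token));
  return denied && !excepted;
}

// "Cubot" is a phone brand whose name happens to contain "bot": the accept
// token prevents a false positive, but a bot whose UA contains no listed
// token slips through entirely - the gap a character-level parser can close.
isBotByTokenLists("Mozilla/5.0 (Linux; Android 10; CUBOT X30)"); // false
isBotByTokenLists("Mozilla/5.0 (compatible; SomeCrawler/1.0)");  // true
```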
Bot traffic that we cannot track
- Botnets — Where a desktop is compromised via malware, it can become part of a botnet. In this situation, it becomes a source of both human and bot traffic. Since the UA headers are identical in each case, DeviceAtlas classifies this as human traffic.
- Masquerading bots — Where a bot masquerades perfectly as a desktop or mobile browser, with identical headers to the real device, it is not possible to classify it as a bot based on header inspection alone.
More insights on Bot Traffic
- Introduction to Bot Traffic — Part One of our Bot Analytics Series
- Bot Analytics Series Part 2: The Importance of Monitoring Bot Traffic
Next in the Analysing Analytics Series
We will be comparing and contrasting the features and capabilities of different analytics solutions for identifying and measuring traffic, while flagging any missed opportunities.
Please sign up for DeviceAtlas updates if you would like to be notified when the next part in the series is published.
DeviceAtlas Free Trial
There are many approaches to analysing your website’s traffic, but not all of them give you visibility of bot traffic.
To find out more about DeviceAtlas’ solution, sign up for a free trial today.