
Using New Relic for Technical SEO

While New Relic is not primarily intended as a tool for SEOs, if you already have proper tracking set up in New Relic for your website (for example, an Adobe Commerce Cloud store), it makes sense to utilise it for technical SEO purposes as well.

That’s what we decided to do at 418.

We’ve built ourselves an SEO dashboard in New Relic for a quick daily overview.

In this tutorial I will guide you through the process of creating some very useful custom dashboards for these purposes.

Creating your first custom dashboard in New Relic

Log in to New Relic, click “Dashboards” in the left sidebar, click the “+ Create a dashboard” button in the top right, and select “Create a new dashboard” in the sidebar modal that pops up.

Name your dashboard as you please and set up permission levels as needed for your specific use case.

Create a chart showing bot traffic to your website in the last 24 hours in New Relic

Now that we have our custom dashboard, we want to fill it up with some useful data for SEOs.

The first thing we want to visualise and keep an eye on in our custom New Relic SEO dashboard is the traffic we receive from different bots.

To do that, we will utilise NRQL queries.

In New Relic, navigate to “Query your data” from the left sidebar and add the following query to the query builder:

SELECT count(*) FROM Log FACET request_user_agent WHERE request_user_agent LIKE '%bot%' SINCE 24 HOURS AGO TIMESERIES

This should generate something like this for you:

New Relic chart showing bot activity over the last 24 hours

This will match all user agents that have the word “bot” in their name, which is the vast majority of them and will usually suffice.

If you want to be extra careful, you can also get a list of user agents of known bad bots and add them separately to the LIKE part of the query, for example:

SELECT count(*) FROM Log FACET request_user_agent WHERE (request_user_agent LIKE '%PetalBot%' OR request_user_agent LIKE '%SEMrushBot%' OR request_user_agent LIKE '%DotBot%') SINCE 24 HOURS AGO TIMESERIES

And so on; just add as many bots as you want from whatever list of bad bots you have acquired.
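
Conversely, you may want to exclude the bots you actually do want crawling your site, so the chart only shows the rest. A minimal variant of the first query, assuming Googlebot and Bingbot are the ones you want to keep out of the chart (adjust the list to taste), would be:

SELECT count(*) FROM Log FACET request_user_agent WHERE request_user_agent LIKE '%bot%' AND request_user_agent NOT LIKE '%Googlebot%' AND request_user_agent NOT LIKE '%bingbot%' SINCE 24 HOURS AGO TIMESERIES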

Whichever variant you go with, this line graph, while useful, could be even more useful if we had a better sense of what the total bot traffic per hour amounts to. To achieve this, flip the “Chart type” dropdown on the right side from “Line” to “Stacked bar”.

Now we have a graph that looks something like this:

An image of a stacked bar graph in New Relic used for technical SEO

Now we can clearly see at which times we have spikes in bot traffic on our website and which bots are responsible for them, and we can act accordingly.
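
By default, New Relic picks the TIMESERIES bucket size based on the time window you query. If you want each bar to represent exactly one hour, you can pin the bucket size in the query itself; this is just the original query with an explicit bucket:

SELECT count(*) FROM Log FACET request_user_agent WHERE request_user_agent LIKE '%bot%' SINCE 24 HOURS AGO TIMESERIES 1 hour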

Even when you take proper action against bots you do not wish to crawl your website (be it through a robots.txt disallow or by blocking their user agents on the server side), it is good to keep an eye on this graph to see whether they are actually well-behaved bots that listen to your robots.txt directives or whether they ignore them.

For this reason, we want to add this chart to the custom SEO dashboard we created in step 1 of this tutorial.

To do that, just click on the “Add to dashboard” button in the bottom right corner and select the dashboard you created for this purpose.
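
If you have already disallowed a specific bot, a narrower query over a longer window makes it easy to see whether it actually stopped crawling. PetalBot is just an example here; substitute whichever bot you blocked:

SELECT count(*) FROM Log WHERE request_user_agent LIKE '%PetalBot%' SINCE 1 week ago TIMESERIES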

Look into which URLs return 404, 500 or similar errors

While many people use crawlers such as Screaming Frog to find URLs with such issues, these crawlers don’t really tell you how often people or bots encounter those URLs on your website.

Our New Relic data does!

To get a list of URLs that returned 4XX or 5XX errors we can use a simple NRQL query:

SELECT count(*) FROM Transaction WHERE httpResponseCode >= '400' SINCE 1 day ago FACET request.uri AS 'URL', httpResponseCode LIMIT MAX

The LIMIT MAX part is important here because we’re using FACET, which only returns 10 results by default.

This should generate something nice for you such as this table:

A table of 4XX and 5XX errors using New Relic for technical SEO

This table is sorted by “count” so you can see the most commonly encountered URLs first, but you can also sort it by HTTP response code to view specific errors first. Just like last time, hit the “Add to dashboard” button in the bottom right to add it to your custom dashboard.
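
You can also narrow the same query down to a single status code if that is what you are hunting for, for example 404s only:

SELECT count(*) FROM Transaction WHERE httpResponseCode = '404' SINCE 1 day ago FACET request.uri AS 'URL' LIMIT MAX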

If, for example, you know that you don’t really use 302 redirects, you can build a table of all URLs that returned a 302 in the past 24 hours using this NRQL query:

SELECT count(*) FROM Transaction WHERE httpResponseCode ='302' SINCE 1 day ago FACET request.uri AS 'URL', httpResponseCode LIMIT MAX

Similarly, if you wish to know which of your 301 redirects are hit most often, so you can change the source links causing those redirects to point to the proper new URL instead of redirecting thousands of users and bots every day, you can find that out using this query:

SELECT count(*) FROM Transaction WHERE httpResponseCode ='301' SINCE 1 day ago FACET request.uri AS 'URL', httpResponseCode LIMIT MAX
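
If you prefer to keep an eye on both kinds of redirects in a single table, you can combine them with IN:

SELECT count(*) FROM Transaction WHERE httpResponseCode IN ('301', '302') SINCE 1 day ago FACET request.uri AS 'URL', httpResponseCode LIMIT MAX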

Create a chart comparing Googlebot activity with the previous day

This can be very useful in your daily SEO dashboard, where you can easily spot anomalies that happened after the latest deploy, changes to your robots.txt, or similar events. Try this query:

SELECT count(*) FROM Log WHERE `request_user_agent` LIKE '%Googlebot%' TIMESERIES MAX COMPARE WITH 1 day ago

This will give you a chart like this that you can add to your dashboard, as always using the button on the bottom right:

Chart showing Googlebot activity on the website for 1 hour compared to the same 1 hour on the previous day
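
If your traffic has a strong weekly pattern, comparing against the same hour one week ago can be less noisy than comparing with yesterday; the query only needs a small change:

SELECT count(*) FROM Log WHERE `request_user_agent` LIKE '%Googlebot%' TIMESERIES MAX COMPARE WITH 1 week ago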

Build and examine important funnels

Using NRQL’s “funnel” in a query, you can build specific funnels of URLs your customers visit and track their completion rate. Here is an example:

FROM PageView SELECT funnel(session, WHERE pageUrl LIKE '%checkout%' as 'Checkout', WHERE pageUrl LIKE '%checkout/onepage/success%' as 'Checkout success') SINCE 1 day AGO

You can add more URLs to the funnel or modify these URLs to your liking. This NRQL query would result in a graph that looks like this:

A chart showing checkout completion in Magento 2 / Adobe Commerce using NRQL funnel in New Relic
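
To add a step, you could, for example, start the funnel one step earlier at the cart page. The '%checkout/cart%' pattern below assumes the default Magento 2 cart URL, and since that URL also contains “checkout” you may want to tighten the patterns for your own store:

FROM PageView SELECT funnel(session, WHERE pageUrl LIKE '%checkout/cart%' as 'Cart', WHERE pageUrl LIKE '%checkout%' as 'Checkout', WHERE pageUrl LIKE '%checkout/onepage/success%' as 'Checkout success') SINCE 1 day AGO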

Figure out if your campaigns are breaking through your cache

If you have worked in digital marketing for some time, you’ve probably experienced it a few times: that moment when you shoot yourself in the foot (from a website performance perspective).

It is a good idea to keep track of your page cache hits and misses to spot any anomalies; maybe some of your recent campaign parameters are constantly breaking through the cache when they shouldn’t.

This simple NRQL query will create a chart that gives you a daily overview at a glance in your custom dashboard:

SELECT count(*) FROM Log FACET cache_status SINCE 1 day AGO TIMESERIES

This will generate a chart such as this one:

New Relic chart of cache status (HIT, MISS, PASS, SYNTH, ERROR, HITPASS)
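
To dig into what exactly is bypassing the cache, you can facet the misses by URL. Note that request_url here is an assumption on my part; check your Log events for the attribute that holds the requested URL (ideally including the query string, so campaign parameters show up) in your setup:

SELECT count(*) FROM Log WHERE cache_status = 'MISS' FACET request_url SINCE 1 day AGO LIMIT MAX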

Written by Toni Aničić – Founder & CEO at 418 d.o.o. on 13th of April 2023

NEED HELP WITH YOUR MAGENTO 2 / ADOBE COMMERCE SEO?

Talk to Agency 418!