The Best Of


Noah
Jan 19, 2025, 11:13 AM
Forwarded thread from another channel:
Loren Ross
Nov 15, 2024, 9:18 AM
Hi!
Question about getting accurate page speed data at scale.
I’m curious what tools you all use to get accurate page speed data at scale. GTmetrix seems like the best option; I don’t love the Core Web Vitals UX data, and Google Analytics page speeds seemed wildly inaccurate, but I wanted to get your thoughts.
Thanks in advance
Loren
Boris Kuslitskiy
Nov 15, 2024, 9:20 AM
For what purpose? CrUX is what's relevant to SEO directly, and GSC has data. I personally like GTmetrix for tracking key pages, but haven't used it at scale.
Whatever other tools you're using can probably accept a PageSpeed API key and pull data for you.
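Boris's point about pulling data with a PageSpeed API key can be sketched directly. This is a minimal example against the public PageSpeed Insights v5 `runPagespeed` endpoint; the API key, the URL list, and exactly which response fields you keep are assumptions to adapt to your own setup:

```typescript
// Minimal sketch: bulk PageSpeed Insights (v5) collection.
// Assumes you have an API key from Google Cloud; the field paths below
// (loadingExperience = CrUX field data, lighthouseResult = lab data)
// come from the v5 response shape and are worth verifying.

const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

// Build the request URL for one page.
function psiRequestUrl(
  pageUrl: string,
  apiKey: string,
  strategy: "mobile" | "desktop" = "mobile"
): string {
  const params = new URLSearchParams({ url: pageUrl, key: apiKey, strategy });
  return `${PSI_ENDPOINT}?${params.toString()}`;
}

// Fetch field (CrUX) and lab metrics for a batch of URLs, one at a time
// to stay under the API's rate limits.
async function fetchPsiBatch(urls: string[], apiKey: string) {
  const results: Array<{ url: string; lcpFieldMs?: number; perfScore?: number }> = [];
  for (const u of urls) {
    const res = await fetch(psiRequestUrl(u, apiKey));
    const data: any = await res.json();
    results.push({
      url: u,
      // 75th-percentile LCP from real users (CrUX), in milliseconds.
      lcpFieldMs: data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS?.percentile,
      // Lighthouse performance score, 0..1 (lab).
      perfScore: data.lighthouseResult?.categories?.performance?.score,
    });
  }
  return results;
}
```

This is essentially what Screaming Frog's PSI integration does under the hood, so a script like this only earns its keep if you need the data somewhere a crawler can't put it.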
Shawn Huber
Nov 15, 2024, 9:35 AM
Screaming Frog connects to PageSpeed Insights and does a great job at scale
Loren Ross
Nov 15, 2024, 9:36 AM
Yeah, PageSpeed Insights with Screaming Frog definitely makes sense
Shawn Huber
Nov 15, 2024, 9:37 AM
Though if you have the technical chops, or a solid team, you can connect CrUX to GA4 and get that data directly in the platform. You can then create great Looker Studio dashboards
Shawn Huber
Nov 15, 2024, 9:37 AM
The beauty of this approach is that it will tell you the specific div that's causing any of the scores not to be green
Dave Smart
Nov 15, 2024, 12:02 PM
The problem with "accurate" is that there isn't really one score for anything: there's no single LCP, CLS, or INP score a URL has. It varies with so many things, some of them not directly in your control, like a latest flagship device on Wi-Fi vs. a budget Android on spotty 3G.
So what you need is representative.
CrUX is a good thing, especially if you have some decent traffic, so you get good URL-level coverage.
But ideally you'd want to collect your own real-user metrics; I'll add in things like SpeedCurve (linked below) as services that are simple to set up. If you already have something like New Relic or Sentry, you may be able to gather it through them too.
With your own data you can get more fine-grained, e.g. there's poor LCP, but only from certain countries.
If you have dev resources, point them to the web-vitals library (linked below); you can get that sweet attribution data @Shawn Huber mentioned (it's not in the CrUX data, unfortunately).
Lab data, like GTmetrix, is still super useful because it's a much more controlled environment, but it's best used comparatively, i.e. did the change you pushed improve or worsen that metric? Keep in mind you then need to look at the real-user metrics to validate whether it actually helped in the real world.
SpeedCurve: SpeedCurve | Website Performance Monitoring (see how people experience the speed of your website, then identify and fix performance issues)
GitHub: GoogleChrome/web-vitals (essential metrics for a healthy site)
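For the attribution data Dave mentions, here is a rough sketch of what forwarding web-vitals metrics to GA4 can look like. The attribution field names (`element`, `largestShiftTarget`, `interactionTarget`) follow the web-vitals v4 attribution build and should be checked against the library's docs; the `web_vitals` event name and `debug_target` parameter are illustrative choices, not a fixed API:

```typescript
// Sketch: mapping a web-vitals attribution metric to GA4 event params.
// The web-vitals import and gtag() call are shown commented out so this
// stays self-contained; in the browser you'd uncomment and wire them up.

// Minimal shape of a metric from the attribution build (subset, assumed).
interface VitalsMetric {
  name: "LCP" | "CLS" | "INP";
  value: number;
  rating: "good" | "needs-improvement" | "poor";
  attribution?: {
    element?: string;            // LCP: selector of the LCP element
    largestShiftTarget?: string; // CLS: selector of the shifted element
    interactionTarget?: string;  // INP: selector of the interaction target
  };
}

// Build GA4 event params, including the offending element's CSS selector,
// i.e. the "specific div" mentioned above. GA4 custom dimensions work best
// with integers, so CLS is scaled by 1000 (a common convention).
function toGa4Event(metric: VitalsMetric): Record<string, string | number> {
  const target =
    metric.attribution?.element ??
    metric.attribution?.largestShiftTarget ??
    metric.attribution?.interactionTarget ??
    "(none)";
  return {
    metric_name: metric.name,
    metric_value: Math.round(metric.name === "CLS" ? metric.value * 1000 : metric.value),
    metric_rating: metric.rating,
    debug_target: target,
  };
}

// Browser wiring would look roughly like:
// import { onLCP, onCLS, onINP } from "web-vitals/attribution";
// const send = (m: VitalsMetric) => gtag("event", "web_vitals", toGa4Event(m));
// onLCP(send); onCLS(send); onINP(send);
```

In GA4 you'd then register `debug_target` as a custom dimension so the offending selector shows up in reports and the Looker Studio dashboards Shawn describes.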
Kyle Faber
Nov 15, 2024, 12:23 PM
What are your thoughts on datadog, @Dave Smart?
Shawn Huber
Nov 15, 2024, 12:25 PM
I've used that - it works pretty well, but unless you have a huge budget it doesn't keep historical data for more than a few weeks
Shawn Huber
Nov 15, 2024, 12:26 PM
Could be that it's just what we're willing to pay that restricts my data window
Dave Smart
Nov 15, 2024, 12:28 PM
It's a neat tool, but it can be a bit pricey "just" for CWV metrics, like @Shawn Huber says. If you're using the other stuff too, it's pretty handy, and a bit easier to navigate than, say, New Relic
Joel Herbert
Nov 16, 2024, 11:32 AM
I work closely with QA and Engineering, so I rely on both WPT and DebugBear via an API. WPT had been my go-to, but it got too costly once we needed to implement their API to track our multiple sites. DebugBear was cheaper, at least until we wanted to scale, and I loved the simplicity of the UI.
CrUX is great for tracking and reporting out, but not for diagnosing real-user issues in real time (it's not synthetic).
Engineering has just implemented Datadog as well, but yeah, I hear it's pretty costly... and the UI is somewhat cumbersome for my liking.
