
Page Speed Insights: Site Performance & Core Web Vitals

Analyze page performance with Page Speed Insights. Understand Core Web Vitals, interpret CrUX field data, and diagnose site speed issues.

Monthly search volume for "page speed insights": 301.0k

PageSpeed Insights (PSI) is Google's free tool that analyzes how fast your web pages load on mobile and desktop devices. It combines real-world user data from the Chrome User Experience Report (CrUX) with simulated lab tests from Lighthouse to diagnose performance issues. For marketers and SEO practitioners, PSI provides the direct connection between technical performance and search visibility, showing exactly where users experience friction that impacts search engine rankings.

What is PageSpeed Insights?

PSI reports on the user experience of a specific page across mobile and desktop environments. The tool integrates two distinct data sources: field data from the Chrome User Experience Report (CrUX), which captures anonymized performance from real users over a 28-day period, and lab data generated by Lighthouse, which simulates page loads under controlled conditions. Note that the PageSpeed Insights browser extension for Chrome has been deprecated; Google directs users to the online version at pagespeed.web.dev.

Why PageSpeed Insights matters

  • Validates real user pain points: Unlike synthetic tests, PSI's field data shows how actual visitors experience your site across diverse devices and network conditions.
  • Directly impacts SEO: Speed influences search rankings. PSI identifies specific optimizations to improve visibility.
  • Pinpoints technical debt: Lab diagnostics reveal exactly which assets, scripts, or render-blocking resources slow down your pages.
  • Updates faster than monthly reports: PSI aggregates CrUX data daily, allowing you to monitor recent changes rather than waiting for monthly BigQuery datasets.
  • Pass/fail assessment: The tool provides a clear Core Web Vitals verdict (pass/fail) based on whether the 75th percentile of LCP, INP, and CLS metrics meet "Good" thresholds.
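The pass/fail verdict described above can be sketched in a few lines of Python. The thresholds (LCP ≤ 2.5 s, INP ≤ 200 ms, CLS ≤ 0.1) are Google's documented "Good" boundaries; the function and dictionary names here are illustrative, not part of any PSI API.

```python
# "Good" thresholds for Core Web Vitals, as documented by Google.
GOOD_THRESHOLDS = {
    "LCP": 2500,  # Largest Contentful Paint, milliseconds
    "INP": 200,   # Interaction to Next Paint, milliseconds
    "CLS": 0.1,   # Cumulative Layout Shift, unitless
}

def passes_core_web_vitals(p75_metrics: dict) -> bool:
    """Return True when every reported metric's 75th-percentile value is 'Good'."""
    return all(
        p75_metrics[name] <= limit
        for name, limit in GOOD_THRESHOLDS.items()
        if name in p75_metrics
    )

print(passes_core_web_vitals({"LCP": 2100, "INP": 180, "CLS": 0.05}))  # True
print(passes_core_web_vitals({"LCP": 2100, "INP": 550, "CLS": 0.05}))  # False
```

Note that a single failing metric fails the whole assessment, which is why the INP example above does not pass despite a fast LCP.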

How PageSpeed Insights works

  1. Enter your URL at pagespeed.web.dev and run the analysis.
  2. PSI pulls field data from CrUX for the specific URL (or falls back to origin-level data if the URL lacks sufficient samples).
  3. The tool reports the 75th percentile for each metric, categorizing experiences as Good, Needs Improvement, or Poor.
  4. Lighthouse simulates the page load using a mid-tier Moto G4 device on a mobile network for mobile tests, and an emulated desktop with wired connection for desktop tests.
  5. PSI generates scores for Performance, Accessibility, Best Practices, and SEO categories, with specific audits detailing optimization opportunities.
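The steps above can also be driven programmatically through the PageSpeed Insights API (v5). A minimal sketch, assuming the v5 response shape with field data under `loadingExperience` and lab data under `lighthouseResult` (the sample payload below is illustrative, not a real response):

```python
from urllib.parse import urlencode

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(url: str, strategy: str = "mobile") -> str:
    """Build a PSI API v5 request URL; an API key is recommended for heavy use."""
    return f"{API}?{urlencode({'url': url, 'strategy': strategy})}"

# Illustrative slice of a v5 response: field (CrUX) data under
# `loadingExperience`, lab (Lighthouse) data under `lighthouseResult`.
sample = {
    "loadingExperience": {
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2300, "category": "FAST"},
        },
    },
    "lighthouseResult": {"categories": {"performance": {"score": 0.92}}},
}

field_lcp_p75 = sample["loadingExperience"]["metrics"][
    "LARGEST_CONTENTFUL_PAINT_MS"]["percentile"]
lab_score = round(sample["lighthouseResult"]["categories"]["performance"]["score"] * 100)
print(field_lcp_p75, lab_score)  # field p75 in ms, lab score out of 100
```

Reading both values side by side mirrors what the PSI UI does: the field p75 drives the Core Web Vitals verdict, while the lab score summarizes the simulated Lighthouse run.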

Best practices

  • Prioritize the 75th percentile over averages: Google selects the 75th percentile so that passing means at least three-quarters of your visitors had a good experience, including those on slower devices and networks, not just those under optimal conditions.
  • Focus on Core Web Vitals first: Address Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS) before optimizing secondary metrics like First Contentful Paint or Time to First Byte.
  • Compare mobile and desktop separately: Mobile tests simulate a Moto G4 on a mobile network, while desktop uses wired connections. Optimize for the constraints your primary audience uses.
  • Use field data to validate, lab data to debug: When CrUX shows poor real-user experiences but Lighthouse scores look good, investigate variability in user devices and networks rather than dismissing the issue.
  • Check daily for trending: Since PSI updates every 24 hours covering the trailing 28 days, monitor regularly to catch performance regressions faster than monthly reporting allows.
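The first practice above is easiest to see with numbers: a handful of slow experiences barely moves the mean but can push the 75th percentile past a threshold. The LCP samples below are hypothetical, and the nearest-rank percentile used here is one simple convention; CrUX's internal aggregation method may differ.

```python
def percentile(values, pct):
    """Nearest-rank percentile of a list of values (simple illustrative method)."""
    ordered = sorted(values)
    rank = max(0, round(pct / 100 * len(ordered)) - 1)
    return ordered[rank]

# Hypothetical LCP samples in milliseconds: most loads are fast,
# but a slow tail of older devices drags the p75 past 2500 ms.
lcp_samples = [1500, 1600, 1700, 1800, 1900, 2000, 2100, 2900, 3400, 4100]
mean = sum(lcp_samples) / len(lcp_samples)
p75 = percentile(lcp_samples, 75)
print(f"mean={mean:.0f}ms p75={p75}ms")  # mean=2300ms p75=2900ms
```

Here the average (2300 ms) sits comfortably inside the "Good" LCP threshold of 2500 ms, yet the 75th percentile (2900 ms) fails it, which is exactly the tail-user experience the metric is designed to surface.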

Common mistakes

  • Mistake: Treating a "good" lab score as a guarantee of user satisfaction. You might see a green 90+ Lighthouse score while CrUX data shows real users hitting poor thresholds due to older devices or slow networks. Fix: Always cross-reference field data before declaring victory.
  • Mistake: Obsessing over TTFB or FCP while ignoring INP. While Time to First Byte and First Contentful Paint appear in reports, they do not determine your Core Web Vitals pass/fail status. Fix: Allocate resources to LCP, CLS, and INP first.
  • Mistake: Testing staging environments or recently published pages and expecting CrUX data. The Chrome User Experience Report only includes public, crawlable, indexable URLs with sufficient distinct user samples over 28 days. Fix: For new pages, rely on origin-level data or wait for traffic accumulation.
  • Mistake: Expecting identical scores between consecutive tests. Network availability, hardware contention, and datacenter location variability cause legitimate fluctuations. Fix: Run multiple tests and look for patterns rather than single snapshots.
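For the last mistake, a simple way to "look for patterns rather than single snapshots" is to summarize several runs with a median, which is robust to one-off outliers. The scores below are hypothetical Lighthouse Performance scores from five consecutive runs of an unchanged page:

```python
import statistics

# Hypothetical Lighthouse Performance scores from five consecutive
# PSI runs of the same unchanged page; variance like this is normal.
runs = [85, 91, 88, 95, 87]
typical = statistics.median(runs)
print(typical)  # 88
```

A median of 88 is a more honest summary than either the unlucky 85 or the lucky 95 run.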

Examples

  • Example scenario: A product page shows 90% of LCP experiences as "good" (under 2.5s) but fails the Core Web Vitals assessment due to poor INP (over 500ms). This indicates the page loads fast visually but responds slowly to clicks, requiring JavaScript optimization rather than image compression.
  • Example scenario: A blog post published last week returns "No Data" for URL-level field data. PSI automatically falls back to origin-level data showing the entire domain has poor CLS scores. You diagnose the issue as a site-wide intrusive banner causing layout shifts across all pages.
  • Example scenario: Your lab score jumps between 85 and 95 between tests with no code changes. Checking the Lighthouse report reveals tests ran from different Google datacenters under varying network conditions. You implement fixes to reduce Total Blocking Time to stabilize performance despite network variance.

FAQ

Why do my lab and field data contradict each other? Lab data simulates a single device on fixed network conditions, while field data aggregates real experiences across diverse environments. Field data reflects historical performance over 28 days, so recent optimizations might not appear immediately.

What does the 75th percentile mean? Google uses the 75th percentile so that a single number still accounts for users on slower devices and connections, not just the typical visitor. If your LCP 75th percentile is 2.0 seconds, roughly 75% of users experienced loads of 2.0 seconds or faster, and 25% experienced slower ones.

Why is there no real-user data for my URL? CrUX requires the URL to be public, crawlable, indexable, and have sufficient distinct samples. New pages or low-traffic pages often lack data. PSI then falls back to origin-level data or shows "No Data" if the entire origin lacks samples.
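The fallback behavior described in this answer can be sketched as a small helper. The function name and tuple shape are illustrative, not part of the PSI API:

```python
def choose_field_data(url_data, origin_data):
    """Mirror PSI's fallback: URL-level CrUX data when present, else origin-level,
    else no field data at all."""
    if url_data:
        return ("url", url_data)
    if origin_data:
        return ("origin", origin_data)
    return ("none", None)

# A new page with no URL-level samples falls back to origin-level data.
source, data = choose_field_data(None, {"CLS": {"percentile": 0.28}})
print(source)  # origin
```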

How often does PSI update? PSI aggregates new field data daily, covering the previous 28 days. This differs from the BigQuery CrUX dataset, which updates monthly.

What device does Lighthouse use? Mobile tests simulate a mid-tier Moto G4 on a mobile network. Desktop tests use an emulated desktop with a wired connection.

What is a good PSI score? Lab scores of 90 or above are considered good, 50 to 89 need improvement, and below 50 are poor. For field data, metrics must meet "Good" thresholds at the 75th percentile to pass Core Web Vitals.
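The lab-score bands in this answer map to a simple classification. The helper name below is ours; the boundaries (90+, 50 to 89, below 50) are the ones Lighthouse documents:

```python
def score_label(score: int) -> str:
    """Map a 0-100 Lighthouse score to PSI's color bands."""
    if score >= 90:
        return "good"                # shown in green
    if score >= 50:
        return "needs improvement"   # shown in orange
    return "poor"                    # shown in red

print(score_label(92), score_label(67), score_label(38))
```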
