Core Web Vitals Report
Measure what matters. Fix what's slow.
The Core Web Vitals report surfaces LCP, INP, and CLS for your key pages. Use it to identify performance issues, prioritize fixes based on impact, and track improvements over time. Results roll into client-ready SEO reports so stakeholders see the progress.
3 Core Metrics · Lab + Field Data Sources · On-Demand Refresh Available
“We used to guess which pages were slow. Now we know exactly what to fix and can show clients the improvement over time. CWV reports became a standard part of our monthly reviews.”
— Daniel K., Technical SEO Lead
Metrics Included (LCP, INP, CLS)
Core Web Vitals measure three aspects of user experience: loading speed, interactivity, and visual stability. Here's what each metric tells you.
LCP (Largest Contentful Paint)
What it measures: How long it takes for the largest visible content element (usually a hero image, heading, or text block) to render on screen.
Why it matters: LCP reflects perceived loading speed. Users notice when the main content takes too long to appear, and Google uses LCP as a ranking signal.
- Good: ≤ 2.5s
- Needs Improvement: 2.5–4.0s
- Poor: > 4.0s
Common causes of poor LCP:
- Slow server response time (TTFB)
- Render-blocking CSS and JavaScript
- Large, unoptimized hero images
- Slow third-party resources
INP (Interaction to Next Paint)
What it measures: How quickly the page responds to user interactions (clicks, taps, key presses). INP captures the delay between input and visual feedback.
Why it matters: Responsiveness affects whether a site feels snappy or sluggish. Pages that lag on interactions frustrate users and signal poor UX to search engines.
- Good: ≤ 200ms
- Needs Improvement: 200–500ms
- Poor: > 500ms
Common causes of poor INP:
- Heavy JavaScript execution
- Long tasks blocking the main thread
- Third-party scripts competing for resources
- Unoptimized event handlers
CLS (Cumulative Layout Shift)
What it measures: How much the page layout shifts unexpectedly during loading. CLS quantifies visual stability—elements moving around after they've already rendered.
Why it matters: Layout shifts are disorienting. Users click the wrong thing when content jumps. Unexpected shifts erode trust and hurt engagement.
- Good: ≤ 0.1
- Needs Improvement: 0.1–0.25
- Poor: > 0.25
Common causes of poor CLS:
- Images and embeds without explicit dimensions
- Ads or banners injected without reserved space
- Web fonts causing text reflow (FOUT/FOIT)
- Dynamically loaded content pushing existing elements
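The Good / Needs Improvement / Poor bands above can be expressed as a small rating helper. This is a minimal sketch; the function and constant names are illustrative, not part of the report:

```python
# Rate a Core Web Vitals measurement against the published thresholds.
# Units: LCP in seconds, INP in milliseconds, CLS is unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # good <= 2.5s, poor > 4.0s
    "INP": (200, 500),   # good <= 200ms, poor > 500ms
    "CLS": (0.1, 0.25),  # good <= 0.1, poor > 0.25
}

def rate(metric: str, value: float) -> str:
    """Return 'good', 'needs improvement', or 'poor' for a metric value."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.1))   # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.3))   # poor
```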
Lab vs Field: How to Interpret Your Numbers
Core Web Vitals data comes from two sources, and they don't always match. Understanding the difference helps you interpret results correctly.
Lab Data
Lab data comes from synthetic tests—a simulated page load under controlled conditions. PageSpeed Insights runs the page through a standard device and network profile and reports what it measures.
What it's good for:
- Diagnosing specific issues
- Testing changes before they go live
- Controlled, repeatable test conditions
Limitations:
- Doesn't reflect real user conditions
- Can miss issues on certain devices or networks
- Results vary between runs
Field Data
Field data (also called real user monitoring, or RUM) comes from actual visitors. Google collects anonymized performance data from Chrome users and aggregates it over a rolling window (typically 28 days).
What it's good for:
- Understanding actual user experience
- Seeing how diverse devices and networks affect performance
- Tracking real-world improvement over time
Limitations:
- Requires sufficient traffic to generate data
- Updates lag (rolling window)
- Not available for all pages or origins
Why They Differ
It's normal to see different numbers in lab and field data. Lab tests simulate one scenario; field data reflects thousands of different user conditions. A page might score well in lab tests but poorly in the field if real users are on slower devices or networks.
How to use both:
- Use lab data for debugging and diagnosing specific issues
- Use field data to track real-world user experience over time
- Don't expect instant field improvement after fixes—the rolling window takes time to reflect changes
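The lag in field data can be illustrated with a toy rolling average: even after a fix ships, a 28-day window still contains pre-fix measurements, so the reported number improves gradually. The numbers below are illustrative, not real CrUX data:

```python
# Toy illustration of why field data lags a deploy: a 28-day rolling
# average still includes pre-fix days until the window fully rolls over.
def rolling_mean(daily_lcp, window=28):
    """Mean of the most recent `window` daily values."""
    recent = daily_lcp[-window:]
    return sum(recent) / len(recent)

# 28 days at 4.0s LCP, then a fix brings daily LCP down to 2.0s.
history = [4.0] * 28
history.append(2.0)                      # day 1 after the fix
print(round(rolling_mean(history), 2))   # still ~3.93, barely moved

history.extend([2.0] * 27)               # 28 days after the fix
print(round(rolling_mean(history), 2))   # 2.0, window fully rolled over
```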
How We Prioritize Core Web Vitals Fixes
Not all performance issues deserve equal attention. Here's how to prioritize.
Fix High-Traffic Pages First
Start with pages that drive the most organic traffic and conversions. Improving CWV on your top landing pages has more impact than fixing a low-traffic blog post. Use your SEO Reporting Dashboard to identify which pages matter most.
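A traffic-weighted fix queue can be as simple as filtering failing pages and sorting by visits. The page data here is made up for illustration, not pulled from the dashboard:

```python
# Sketch: order CWV work by organic traffic among pages failing a target.
# Page entries are illustrative placeholders.
pages = [
    {"url": "/blog/old-post", "traffic": 120,   "lcp_s": 5.2},
    {"url": "/pricing",       "traffic": 9800,  "lcp_s": 4.4},
    {"url": "/",              "traffic": 22000, "lcp_s": 2.1},
    {"url": "/features",      "traffic": 15000, "lcp_s": 4.8},
]

def fix_queue(pages, lcp_target=2.5):
    """Pages failing the LCP target, highest-traffic first."""
    failing = [p for p in pages if p["lcp_s"] > lcp_target]
    return sorted(failing, key=lambda p: p["traffic"], reverse=True)

for p in fix_queue(pages):
    print(p["url"])   # /features, /pricing, /blog/old-post
```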
Prioritize Template-Level Fixes
If the same issue affects multiple pages (e.g., a site-wide header script or a shared image component), fixing it once improves CWV across the entire template. Global fixes compound.
Use the Impact-Effort Matrix
| | Low Effort | Medium Effort | High Effort |
|---|---|---|---|
| High Impact | Do first | Plan soon | Evaluate ROI |
| Medium Impact | Quick wins | Schedule | Deprioritize |
| Low Impact | If easy | Skip | Skip |
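When triaging a backlog, the matrix can be kept as a simple lookup. A sketch whose labels mirror the table above:

```python
# Impact-effort triage lookup mirroring the matrix above.
ACTIONS = {
    ("high",   "low"):    "Do first",
    ("high",   "medium"): "Plan soon",
    ("high",   "high"):   "Evaluate ROI",
    ("medium", "low"):    "Quick wins",
    ("medium", "medium"): "Schedule",
    ("medium", "high"):   "Deprioritize",
    ("low",    "low"):    "If easy",
    ("low",    "medium"): "Skip",
    ("low",    "high"):   "Skip",
}

def triage(impact: str, effort: str) -> str:
    """Return the recommended action for an (impact, effort) pair."""
    return ACTIONS[(impact, effort)]

print(triage("high", "low"))     # Do first
print(triage("medium", "high"))  # Deprioritize
```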
For LCP
1. Optimize the largest content element (hero image, heading)
2. Reduce server response time (TTFB)
3. Remove render-blocking resources
4. Preload critical assets
For INP
1. Break up long JavaScript tasks
2. Defer non-critical scripts
3. Audit third-party script impact
4. Optimize event handlers
For CLS
1. Add explicit dimensions to images and embeds
2. Reserve space for ads and dynamic content
3. Use font-display strategies to prevent reflow
4. Avoid injecting content above the fold after load
Data Freshness and Refresh Behavior
Understanding how data updates helps you interpret results and set expectations.
What the Report Shows
The Core Web Vitals report displays the latest PageSpeed Insights data available at the time of the last refresh. This includes both lab results (from the most recent synthetic test) and field data (if available for that origin or URL).
Caching and Quotas
CWV data can be cached for performance and to manage API quota limits. If you refresh frequently, you may see the same results until the cache expires or new data becomes available. PageSpeed Insights has rate limits. During heavy usage periods, refresh requests may be queued or temporarily restricted.
Refresh on Demand
You can trigger a refresh to pull the latest available data. This is useful after deploying fixes—you want to see if lab scores improved. Keep in mind that field data won't reflect changes immediately due to the rolling window.
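If you want to spot-check a fix yourself, the public PageSpeed Insights v5 API returns the same lab data (and field data, when available). A minimal sketch; the `runPagespeed` endpoint is Google's public API, while the helper names and the API key are placeholders:

```python
# Sketch: query the public PageSpeed Insights v5 API for a URL.
# Requires a Google API key; field data appears only if CrUX has
# enough traffic for the URL or origin.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Assemble the request URL for a PageSpeed Insights run."""
    params = urllib.parse.urlencode(
        {"url": page_url, "key": api_key, "strategy": strategy}
    )
    return f"{PSI_ENDPOINT}?{params}"

def fetch_psi(page_url: str, api_key: str) -> dict:
    """Run a fresh lab test and return the parsed JSON response."""
    with urllib.request.urlopen(build_psi_url(page_url, api_key)) as resp:
        return json.load(resp)

# Example (needs a real key and network access):
# report = fetch_psi("https://example.com/", "YOUR_API_KEY")
```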
What "Fresh" Means
- Lab data: Reflects the most recent synthetic test, which updates when you refresh
- Field data: Reflects a rolling window of real user data, typically 28 days, which updates gradually
Note: Don't expect real-time updates. Treat the report as a snapshot that you refresh when you need current numbers.
How Core Web Vitals Connects to Technical SEO and Reporting
CWV reporting doesn't exist in isolation. Here's how it fits into the broader SearchSignal workflow.
Part of Technical SEO Scan
Core Web Vitals checks are included in the Technical SEO Scan. When you run a technical scan, CWV results appear alongside crawl findings, meta tag analysis, and structured data validation.
Included in Client Reports
CWV results can be included in reports you share with clients via the SEO Reporting Dashboard. Clients see performance metrics alongside traffic, keyword, and technical health data—all in one place.
Diagnose Root Causes
When CWV scores are poor, the Technical SEO Scan helps identify root causes. CWV tells you what's slow. Technical scan findings help you understand why.
Root Cause Mapping
Poor LCP?
Check for render-blocking resources, large images, slow server response
Poor INP?
Look for heavy JavaScript, third-party script load, main-thread blocking
Poor CLS?
Review images without dimensions, injected content, font loading strategy
Troubleshooting PageSpeed and CWV Reports
When results are missing or unexpected, here's how to diagnose.
CWV Report Is Blank or Fails
Likely cause:
Provider quota limits reached, a transient API failure, an invalid or non-public URL, or a page that blocks automated fetching.
Fix:
Try again later when quota resets. Test a single URL first to confirm it works. Verify the page loads publicly without authentication or geo-restrictions.
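For transient failures and quota errors, retrying with exponential backoff is a reasonable client-side pattern. This is a generic sketch, not a description of the product's internal behavior:

```python
# Generic retry-with-exponential-backoff sketch for transient API failures.
import time

def with_retries(fetch, attempts=4, base_delay=1.0, sleep=time.sleep):
    """Call `fetch()`, retrying on exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error to the caller
            sleep(base_delay * (2 ** attempt))

# Example: a flaky call that succeeds on the third try.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("quota exceeded")
    return "ok"

print(with_retries(flaky, sleep=lambda s: None))  # ok
```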
Results Look Inconsistent Between Runs
Likely cause:
Lab test variability (network simulation, server response fluctuation), caching, or actual changes to the page between runs.
Fix:
Compare trends over multiple runs rather than fixating on a single score. Focus on consistent patterns—if LCP is always above 4 seconds, that's the signal, not whether it's 4.1 or 4.3.
Field Data Not Available
Likely cause:
Not enough real-user traffic for Google to generate field data for that URL or origin.
Fix:
Rely on lab data for diagnostics. Field data will become available as traffic grows. For low-traffic pages, lab data is your primary signal.
Only Some Pages Show Results
Likely cause:
Page template differences, crawl scope limitations, or quota constraints limiting how many URLs were tested.
Fix:
Prioritize your most important landing pages first. Expand scope incrementally as quota allows. Not every page needs CWV monitoring—focus on pages that matter.
Scores Improved in Lab But Not in Field
Likely cause:
Field data uses a rolling window (typically 28 days). Recent improvements won't appear immediately.
Fix:
Wait for the field data window to reflect your changes. Continue monitoring. If lab scores are consistently good, field data will catch up.
Want to see how your pages perform? Run a Core Web Vitals report and get actionable insights.
Run My CWV Report