Page Speed & Core Web Vitals
Google measures it. Users feel it.
Core Web Vitals are Google's performance ranking signal — three numbers that measure how users actually experience your pages. Fast LCP. Responsive INP. Stable CLS. Hit all three consistently and you rank better, convert more, and bounce less. Miss any of them and you hand that edge to competitors in mobile search. The good news: fixing these is mostly diagnosis plus specific technical changes.
Lighthouse audit
Auditing https://example.ie · mobile · emulated
Performance 94 · Accessibility 100 · Best Practices 92 · SEO 100
Core Web Vitals · field data — all good
LCP (Largest Contentful Paint): 1.8s — GOOD
INP (Interaction to Next Paint): 120ms — GOOD
CLS (Cumulative Layout Shift): 0.04 — GOOD
Inside the metrics
LCP · INP · CLS — what each actually measures.
LCP
Largest Contentful Paint
Time to render the biggest above-fold element (hero image/headline).
Good: ≤ 2.5s
Needs improvement: 2.5–4s
Poor: > 4s
Usual culprits
Slow server response, render-blocking resources, unoptimised hero images, late-loading webfonts.
INP
Interaction to Next Paint
How long the page takes to respond to taps, clicks, and keypresses.
Good: ≤ 200ms
Needs improvement: 200–500ms
Poor: > 500ms
Usual culprits
Heavy JavaScript on the main thread, large event handlers, third-party scripts, un-debounced inputs.
CLS
Cumulative Layout Shift
How much layout jumps around as the page loads (0 = stable, 1 = chaotic).
Good: ≤ 0.1
Needs improvement: 0.1–0.25
Poor: > 0.25
Usual culprits
Images/videos without dimensions, ads loading in, late-inserted content, font-loading FOUT/FOIT.
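The thresholds above collapse into a small rating helper — the same bucketing CrUX applies to field data. A sketch (metric names and units as in the tables above; LCP and INP in milliseconds, `rate` is an illustrative name):

```javascript
// Bucket a metric value into good / needs-improvement / poor, using the
// thresholds listed above. LCP and INP in milliseconds; CLS is unitless.
const THRESHOLDS = {
  LCP: [2500, 4000],
  INP: [200, 500],
  CLS: [0.1, 0.25],
};

function rate(metric, value) {
  const [good, poor] = THRESHOLDS[metric]; // [good ceiling, poor floor]
  if (value <= good) return "good";
  if (value <= poor) return "needs-improvement";
  return "poor";
}
```

So the 1.8s LCP in the audit above rates `good`, while a 350ms INP would rate `needs-improvement`.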
The fix playbook
The specific technical changes that improve each metric.
Fixing LCP
Preload hero image
`<link rel="preload" as="image" href="hero.webp">` tells the browser to start downloading the biggest image immediately instead of waiting for CSS parsing.
Compress + use modern formats
Convert JPEG/PNG hero images to WebP or AVIF — typically 30–50% smaller at the same visual quality. Serve via `<picture>` with a fallback for old browsers.
CDN + edge caching
Hero image served from nearest PoP, not origin in Dublin. Cuts latency from 200-500ms to 20-50ms for international visitors.
Lazy-load below-fold, eager-load above
`loading="lazy"` everywhere except the hero. Give the hero `fetchpriority="high"` so it jumps the queue.
Fixing INP
Break up long JavaScript tasks
Any task over 50ms counts as a long task and blocks the main thread. Split work with `requestIdleCallback` or `setTimeout(fn, 0)`. Measure with the DevTools Performance panel.
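The splitting above can be sketched as a chunked runner: a pure `chunk()` step plus a driver that yields back to the main thread between batches via `setTimeout`. (`chunk`, `processInChunks`, and the batch size are illustrative names; newer Chrome also offers `scheduler.yield()` for the same purpose.)

```javascript
// Pure step: split a big work list into fixed-size batches.
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

// Driver: run one batch per macrotask, so input events can be handled
// between batches instead of queueing behind one long task.
function processInChunks(items, work, size = 50) {
  return chunk(items, size).reduce(
    (prev, batch) =>
      prev.then(() => new Promise((resolve) =>
        setTimeout(() => { batch.forEach(work); resolve(); }, 0))),
    Promise.resolve()
  );
}
```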
Debounce / throttle user input
Search-as-you-type, scroll handlers, resize listeners — add debounce (150-300ms). Cuts redundant work dramatically.
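A minimal trailing-edge debounce, assuming nothing beyond standard timers (`debounce` is an illustrative name — lodash's `_.debounce` is the usual production choice):

```javascript
// Trailing-edge debounce: the handler runs once, `delay` ms after the
// last call — so search-as-you-type fires one request per pause, not
// one per keystroke.
function debounce(fn, delay = 200) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), delay);
  };
}

// Usage sketch (element and handler are illustrative):
// input.addEventListener("input", debounce((e) => search(e.target.value), 250));
```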
Remove or defer third-party scripts
Chat widgets, analytics beacons, A/B test scripts eat main-thread time. Load with `defer` or `async` — or delay until user interacts.
Code-split by route
Don't ship all JS for every page. Next.js/Vite auto-split by route. Initial bundle under 170KB gzipped is a solid target.
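Conceptually, route-level splitting is a lazy, memoized import — the route's bundle loads on first visit and never again. Next.js/Vite generate this wiring automatically; a sketch of the once-only semantics (`lazy` and the module path are illustrative):

```javascript
// Wrap a loader so it runs at most once; repeat navigations reuse the
// cached promise instead of re-fetching the bundle. Assumes the loader
// returns something truthy (a promise), as dynamic import() does.
function lazy(loader) {
  let cached = null;
  return () => (cached ??= loader());
}

// Hypothetical usage:
// const loadChart = lazy(() => import("./chart-page.js"));
// router.on("/charts", async () => (await loadChart()).render());
```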
Fixing CLS
Set width + height on every image + video
The browser reserves the space before the asset loads, preventing the shift. Even responsive images need `aspect-ratio` CSS.
Reserve space for ads + embeds
Give ad slots a `min-height` so the placeholder doesn't collapse when the ad loads. Same for YouTube/Twitter embeds.
Use `font-display: optional`, or `swap` plus `size-adjust`
A webfont swap makes text reflow. `size-adjust` on the fallback `@font-face` matches its metrics to the webfont's, eliminating the layout shift.
Don't insert content above existing content
Cookie banners, newsletter popups, announcement bars — these shift page content down. Use fixed-position overlays instead.
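Worth knowing while you fix these: CLS is not a raw sum. Chrome scores the worst "session window" of shifts — a window closes after a 1s gap between shifts or 5s of total duration. A sketch of that aggregation over recorded shifts (timestamps in ms; `cumulativeLayoutShift` is an illustrative name):

```javascript
// shifts: [{ t: timestamp in ms, score: impact × distance fraction }]
// Returns the worst session window: shifts group into a window until
// there's a >1s gap since the last shift or the window exceeds 5s.
function cumulativeLayoutShift(shifts) {
  let worst = 0, windowStart = -Infinity, windowSum = 0, lastT = -Infinity;
  for (const { t, score } of shifts) {
    if (t - lastT > 1000 || t - windowStart > 5000) {
      windowStart = t;   // gap or cap exceeded: start a new session window
      windowSum = 0;
    }
    windowSum += score;
    lastT = t;
    worst = Math.max(worst, windowSum);
  }
  return worst;
}
```

Practical upshot: if one window dominates the score, the shifts inside it — usually a single late-loading element — are the ones to fix first.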
The tool stack
Lab data vs field data — and why both matter.
Lighthouse is a simulation (lab); CrUX is what real users experience (field). Lab reveals causes; field measures actual impact. You need both.
Lab tools
Chrome DevTools Lighthouse
Fast local audit. Run on any page. Great for iteration during dev.
PageSpeed Insights
Web-based; combines Lighthouse lab + CrUX field data. Public, shareable URL.
WebPageTest
Deep waterfall analysis, test from specific locations + device profiles. Free tier generous.
Lighthouse CI
Run in GitHub Actions on every PR. Fail builds if scores regress. Baseline enforcement.
Field tools
Search Console CWV report
Real user metrics for your site, grouped by URL pattern. Official Google source.
Chrome UX Report (CrUX)
28-day aggregate of Chrome users. Public data per origin. The source for Google's ranking signal.
Vercel / Cloudflare Analytics
Your own RUM data if hosted on these. No third-party script needed.
SpeedCurve / New Relic / Datadog RUM
Enterprise real user monitoring. Per-page, per-segment performance tracking.
Myth-busting
What people get wrong about page speed.
Lab score ≠ field performance
A Lighthouse 95 is lovely, but CrUX field data is what Google ranks on. A page can be fast on your dev machine and slow on real 4G mobile. Measure both.
"Mobile-friendly" badge is not enough
That just means the page renders without horizontal scroll. Doesn't measure CWV. Two different criteria — fix both separately.
A CDN alone doesn't fix CWV
CDN helps TTFB/LCP. Doesn't address main-thread JS (INP) or layout shifts (CLS). Still need code-level work. CDN is necessary, not sufficient.
One slow template hurts more than one page
Google evaluates CWV at the URL-pattern level, so a slow template drags down every URL that matches it. Your blog might be fine while your checkout flow fails. Segment and fix pattern by pattern.
More caching = better scores
Aggressive HTML caching breaks dynamic content. Cache what's actually static. Don't cache what shouldn't be cached just to game Lighthouse.
Auto-fixers rarely work fully
"WP Rocket + cache + optimise" button gets you maybe 60% of the way. Getting from "meh" to "good" needs human eyes on specific issues, not autopilot.
Frequently asked
Core Web Vitals questions.
Do Core Web Vitals really move rankings?
As a tiebreaker signal: real, measurable. Not as significant as content quality or backlinks, but when your site and a competitor's are otherwise similar, CWV can decide who ranks. More importantly, bad CWV hurts conversion rates and bounce — and those matter regardless of direct ranking impact.
Let's fix your Core Web Vitals.
60-minute performance audit. Share your site; we run Lighthouse + WebPageTest + Search Console CWV together. Identify the 3-5 specific changes that will move your metrics from amber/red to green. Scope + price after.
