Crescentek

Technical SEO

Make sure Google can find and trust every page.

Crawlability, indexation, rendering, site architecture, Core Web Vitals, HTTPS, mobile usability. The invisible layer that decides whether your content ever gets a fair shot at ranking. We find what's broken and fix it.

Nine issue categories we actively fix

Every one changes what Google sees.

Crawlability
robots.txt, blocked resources, crawl budget waste, redirect chains, orphan pages — the stuff stopping Google from seeing your content.
Indexation
Canonical tags, noindex rules, pagination signals, hreflang for multi-language — telling Google what to actually rank.
Rendering
JS-rendered sites, hydration issues, lazy-loaded content Google can't see, SSR vs CSR decisions on existing stacks.
Site architecture
URL structure, internal link depth, category hierarchy, breadcrumbs — the bones that decide what gets found.
Core Web Vitals
LCP, INP, CLS — the three numbers Google uses. Measured from real-user field data (CrUX), not lab tools.
Mobile usability
Tap targets, font size, viewport config, interstitial placement. Mobile-first indexing means mobile = your site.
HTTPS & security
SSL correctness, mixed content warnings, HSTS, content security policy — both ranking signal and trust signal.
Link architecture
Broken internals, dead externals, redirect chains (see the sketch after this grid), pointless nofollow, missing contextual links between clusters.
Error handling
Soft 404s, real 404s with inbound links, 5xx monitoring, log file analysis to see what Googlebot actually hits.
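One example from the grid above, since redirect chains sit in both the crawlability and link-architecture buckets: a minimal sketch of tracing a chain hop by hop. It assumes Node 18+ (whose fetch supports redirect: 'manual'); redirectChain and the example URL are placeholder names, not part of our crawler.

```ts
// Minimal sketch: follow a redirect chain hop by hop using Node 18+ fetch.
// Anything longer than one hop is usually worth flattening to a single 301.
async function redirectChain(startUrl: string, maxHops = 10): Promise<string[]> {
  const chain = [startUrl];
  let url = startUrl;
  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(url, { redirect: 'manual' });
    const location = res.headers.get('location');
    if (res.status < 300 || res.status >= 400 || !location) break; // final destination reached
    url = new URL(location, url).toString(); // resolve relative Location headers
    chain.push(url);
  }
  return chain;
}

// Usage: redirectChain('https://example.com/old-page').then((hops) => console.log(hops.join(' -> ')));
```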
Core Web Vitals, in honest detail

What the three metrics actually measure — and what typically fixes them.

LCP — Largest Contentful Paint
< 2.5s = good · 2.5–4s = needs work · > 4s = poor
Time from navigation to the largest visible element (hero image, H1 block) being painted
What usually fixes it
  • Optimise hero image (WebP/AVIF, correct sizes)
  • Preload the LCP resource (the sketch below shows how to find it)
  • Server response time (TTFB) <600ms
  • CDN for static assets
  • Remove render-blocking JS/CSS above the fold
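Before you can optimise or preload the LCP element, you need to know which element it is. A minimal sketch using the browser's PerformanceObserver; the largest-contentful-paint entry type is Chromium-only, and the console.log stands in for whatever reporting you actually wire up.

```ts
// Minimal sketch: log which element is the current LCP candidate and when it painted.
// The LCP-specific fields aren't typed in every TS lib version, hence the cast.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lcp = entries[entries.length - 1] as any; // latest largest-contentful-paint entry
  console.log('LCP candidate:', lcp.element, 'painted at', Math.round(lcp.startTime), 'ms');
}).observe({ type: 'largest-contentful-paint', buffered: true });
```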
INP — Interaction to Next Paint
< 200ms = good · 200–500ms = needs work · > 500ms = poor
Worst-case responsiveness of any interaction (click, tap, keypress) during a session
What usually fixes it
  • Break up long JS tasks (<50ms chunks, see the sketch below)
  • Defer non-critical third-party scripts
  • Use Web Workers for heavy computation
  • Debounce input handlers
  • Review event listener bloat (scroll, resize)
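To illustrate the first point, a minimal sketch of chunking a long task so the main thread can handle input between batches. processInChunks, items, and process are placeholder names, not a library API.

```ts
// Minimal sketch: process work in small batches, yielding to the main thread
// roughly every 50ms so clicks and keypresses aren't stuck behind a long task.
async function processInChunks<T>(items: T[], process: (item: T) => void): Promise<void> {
  let lastYield = performance.now();
  for (const item of items) {
    process(item);
    if (performance.now() - lastYield > 50) {
      await new Promise<void>((resolve) => setTimeout(resolve, 0)); // let pending input run
      lastYield = performance.now();
    }
  }
}
```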
CLS — Cumulative Layout Shift
< 0.1 = good · 0.1–0.25 = needs work · > 0.25 = poor
How much unexpected movement of visible elements happens over the page's lifecycle
What usually fixes it
  • Dimensions on all images (width + height attributes)
  • Reserve space for ads / embeds (see the sketch below)
  • Don't inject content above existing content
  • Transform-based animations only
  • font-display: optional or swap with matched fallback fonts
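A minimal sketch of the reserve-space point: hold the expected height of a late-loading ad or embed before it arrives, so nothing below it moves. The selector, the height, and the buildEmbed callback are placeholders for whatever your embed actually is.

```ts
// Minimal sketch: reserve the slot's height up front so a late-arriving embed
// doesn't push down the content below it (which is exactly what CLS measures).
function injectWithoutShift(slotSelector: string, expectedHeightPx: number, buildEmbed: () => HTMLElement): void {
  const slot = document.querySelector<HTMLElement>(slotSelector);
  if (!slot) return;
  slot.style.minHeight = `${expectedHeightPx}px`; // space is held before the embed loads
  slot.appendChild(buildEmbed());                 // no shift when the embed lands
}
```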
What we deliver

An audit that your developer can actually action.

Not a 120-page PDF that sits in a Google Drive folder. A prioritised list of fixes with effort estimates, severity, and example implementations — written so a dev can pick up the ticket and ship.

Crawl diagnostic report
Every URL categorised (indexable / excluded / error / redirect). Screaming Frog + log files analysed together. Export-ready for Jira or Linear.
Prioritised fix list
Each issue tagged Critical / High / Medium / Low. Effort estimate in dev-hours. Expected ranking/traffic impact where quantifiable.
Core Web Vitals report
Field data from CrUX + lab data from Lighthouse, broken out by URL group and device. Specific fixes per page group, not generic advice.
Implementation documentation
For anything non-trivial, code snippets or config examples. robots.txt drafts, canonical patterns, schema templates.
Log file analysis
What Googlebot actually hits (vs what you think it should). Crawl budget waste, orphan pages receiving no crawl, spider traps. See the parsing sketch after this list.
Quarterly re-check
Same audit run 90 days later to verify fixes held and catch regressions from new releases. Included in maintenance retainers.
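For the log-file deliverable, the core of the analysis is simple counting. A minimal sketch for combined-format access logs in Node; in a real audit you'd also verify Googlebot by reverse DNS rather than trusting the user-agent string, and googlebotHitsByPath is a placeholder name, not our tooling.

```ts
// Minimal sketch: count Googlebot requests per URL path from an access log.
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function googlebotHitsByPath(logPath: string): Promise<Map<string, number>> {
  const hits = new Map<string, number>();
  const lines = createInterface({ input: createReadStream(logPath) });
  for await (const line of lines) {
    if (!line.includes('Googlebot')) continue;                  // crude user-agent filter
    const match = line.match(/"(?:GET|POST|HEAD) (\S+) HTTP/);  // request path from the log line
    if (match) hits.set(match[1], (hits.get(match[1]) ?? 0) + 1);
  }
  return hits; // pages in the sitemap with zero hits here are orphan-crawl candidates
}
```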
Tools in our stack

The kit we actually use to find things.

Not an exhaustive list — the ones we reach for daily.

Screaming Frog
Crawl diagnostics
Sitebulb
Deep site audits
Ahrefs
Backlinks + crawl
Semrush
Competitive + audit
GSC / Bing WMT
First-party data
PageSpeed / Lighthouse
Performance lab
CrUX + BigQuery
Real-user vitals
Log file analyser
What Googlebot hits
Frequently asked

Technical SEO questions.

How long does the audit take?
Initial audit and findings report: 2–3 weeks for most sites under 10k URLs. Large ecommerce or multi-site setups can take 4–6 weeks. Implementation then depends on what we find and your dev team's capacity.
Do you need access to our accounts?
For a full audit, yes — Google Search Console, Bing Webmaster Tools, Google Analytics, and usually staging environment access. We can also work with read-only access if your security policy requires it.
Can you fix issues on hosted CMS platforms?
Yes, within platform limits. We document what's possible within the platform vs what would require migrating. Most technical SEO issues are fixable on these platforms; a few (server-side headers, edge-layer rules) are not.
How quickly will we see results?
Some fixes (indexation, crawl errors) show impact within 2–4 weeks. Core Web Vitals typically take 4–8 weeks for CrUX to update. Architectural improvements and link-depth fixes: 2–4 months. We set expectations per fix, upfront.
Can't we just do this ourselves with tools?
Tooling gets you 60% of the way. The remaining 40% is interpretation — deciding what matters, prioritising against effort, and knowing which issues the tools miss. If you have a dev who can read a crawl report critically, tooling plus your time is a reasonable route.

Find out what's actually holding you back.

Share your URL and we'll run an initial diagnostic — no commitment. You'll get a one-pager of the top 10 issues with impact estimates within 3 working days.

All CMS platforms supported
Log file analysis included
Results within 4–12 weeks