Technical SEO · February 20, 2025 · 12 min read

Technical SEO Checklist 2025: 47 Checks That Actually Matter

A practitioner's technical SEO checklist for 2025, covering crawlability, Core Web Vitals, structured data, security, and performance. With specific thresholds, tools, and verdicts on what still moves rankings.

By the Ranking Lens Team

AI Summary

A complete technical SEO checklist for 2025 covers seven domains: (1) Crawlability & indexation: robots.txt, XML sitemaps, canonical tags, redirect chains (max one hop), and JavaScript rendering verification; (2) Core Web Vitals: LCP under 2.5s, INP under 200ms (replaced FID in March 2024), CLS under 0.1, measured from CrUX field data, not PageSpeed lab scores; (3) Mobile-first: 48px minimum touch targets, no horizontal scroll, correct viewport meta; (4) Structured data: Article, FAQPage, BreadcrumbList, and Organization schema with Rich Results Test validation; (5) HTTPS & security: valid SSL, HTTP→HTTPS 301, HSTS, no mixed content; (6) Performance: WebP/AVIF images, Brotli compression, font-display: swap, no render-blocking resources; (7) Internal linking: max 3-click depth, descriptive anchors, no orphan pages. Ranking Lens automates continuous monitoring of these 47 checks with a unified technical audit score.


Why This Checklist Is Different

Most technical SEO checklists are recycled from 2020. They list "have an XML sitemap" alongside "pass Core Web Vitals" as if these are equivalent concerns. They don't tell you which issues create hard ceilings on rankings and which are nice-to-haves.

This checklist prioritizes by impact. We've divided 47 checks across seven domains and marked each with its ranking impact level, so you know where to focus limited time and budget.

A few items that were important in 2020 are now table stakes and aren't listed here. If your site is on HTTPS and has a sitemap, congratulations: that's the baseline. This list focuses on where sites actually fail in 2025.

Key Takeaway: Technical SEO issues don't suppress rankings gradually; they create hard ceilings. A site with JavaScript rendering problems will cap out regardless of content quality and backlinks. Fix the structural issues before optimizing anything else.

Domain 1: Crawlability & Indexation

These are foundational. Crawlability problems prevent Google from seeing your content at all. Indexation problems mean your content exists but can't rank.

robots.txt

| Check | What to Verify | Impact |
| --- | --- | --- |
| robots.txt exists and is valid | Fetch yourdomain.com/robots.txt; confirm it returns 200, not 404 | Medium |
| No critical pages disallowed | Verify your main content URLs are not blocked; run Screaming Frog with "Check robots.txt" | High |
| User-agent specificity | If blocking AI crawlers (GPTBot, ClaudeBot), confirm you're using named user-agents, not wildcard blocks that affect Googlebot | Medium |
  • robots.txt accessible and syntactically valid
  • No money pages, category pages, or article URLs blocked
  • AI crawler policy intentionally set (allow or selectively disallow)
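The checks above can be scripted with Python's standard-library robots.txt parser. A minimal sketch, assuming a hypothetical robots.txt that blocks GPTBot site-wide and everyone else from /admin/ (in practice you'd fetch your live file rather than hardcode it):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch yourdomain.com/robots.txt
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Critical content URLs must remain crawlable for Googlebot
for url in ["https://example.com/blog/article", "https://example.com/admin/login"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"Googlebot {url}: {'allowed' if allowed else 'BLOCKED'}")

# The named-agent block hits GPTBot without touching Googlebot
print("GPTBot blocked:", not parser.can_fetch("GPTBot", "https://example.com/blog/article"))
```

This is exactly the wildcard-vs-named-agent distinction from the table: a `User-agent: *` disallow would have blocked Googlebot too.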

XML Sitemap

  • Sitemap submitted in Google Search Console → Sitemaps
  • Sitemap includes only indexable URLs (no noindex pages, no 301 redirects, no 404s)
  • Sitemap <lastmod> dates are accurate (not all set to today's date, which is a common generator error)
  • For large sites: sitemap index file with multiple sitemaps by content type
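The "all lastmod set to today" generator error is easy to catch programmatically. A sketch using the standard-library XML parser on a hypothetical sitemap (a real audit would fetch the sitemap and also verify each URL returns 200 and is indexable):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical sitemap snippet; in practice, fetch yourdomain.com/sitemap.xml
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/a</loc><lastmod>2025-01-10</lastmod></url>
  <url><loc>https://example.com/b</loc><lastmod>2025-01-10</lastmod></url>
  <url><loc>https://example.com/c</loc><lastmod>2025-01-10</lastmod></url>
</urlset>"""

root = ET.fromstring(SITEMAP)
lastmods = [u.findtext("sm:lastmod", namespaces=NS) for u in root.findall("sm:url", NS)]

# Identical lastmod on every URL usually means the generator stamps "today" on everything
suspicious = len(lastmods) > 1 and len(set(lastmods)) == 1
print("identical lastmod on every URL:", suspicious)
```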

Common mistake: Sitemaps that include URLs returning 301 redirects. Google reads this as a signal that your sitemap is inaccurate, which reduces how often it's fetched.

Canonical Tags

  • Every page has a self-referencing canonical tag
  • Paginated pages (page 2, page 3) have canonical pointing to themselves, not to page 1
  • Canonical in HTTP header matches canonical in HTML (if both are present)
  • No canonical loops (page A canonicals to page B which canonicals back to A)
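Canonical loops are a simple cycle-detection problem once you have a page-to-canonical map from a crawl. A sketch over a hypothetical map (URLs are illustrative):

```python
# Hypothetical canonical map extracted from a crawl: page URL -> its canonical target
canonicals = {
    "https://example.com/a": "https://example.com/b",
    "https://example.com/b": "https://example.com/a",  # loop: a -> b -> a
    "https://example.com/c": "https://example.com/c",  # correct self-canonical
}

def has_canonical_loop(canon: dict, start: str) -> bool:
    """Follow the canonical chain from `start`; a revisited URL means a loop.
    A direct self-canonical (a page pointing at itself) is correct, not a loop."""
    if canon.get(start) == start:
        return False
    seen = {start}
    url = canon.get(start)
    while url is not None:
        if url in seen:
            return True            # chain came back to a URL already visited
        seen.add(url)
        if canon.get(url) == url:  # chain ends at a self-canonical page: fine
            return False
        url = canon.get(url)
    return False                   # chain ends at a page with no known canonical

for page in canonicals:
    if has_canonical_loop(canonicals, page):
        print("canonical loop starting at:", page)
```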

Redirect Architecture

  • No redirect chains longer than one hop (A→B→C is a chain; flatten to A→C)
  • All old internal links updated to point directly to final destination URLs
  • 404 pages return actual 404 status code (test with curl -I url)

Key Takeaway: Run a redirect crawl with Screaming Frog or Sitebulb quarterly. Sites that have gone through even one CMS migration or redesign almost always have chain accumulation they don't know about.

JavaScript Rendering

This is where most modern sites fail silently.

  • Main content body is present in raw HTML (not injected by JavaScript)
  • Internal navigation links are present in initial HTML response
  • Test: GSC URL Inspection → View Crawled Page; compare rendered to raw HTML
  • No critical content hidden behind JavaScript-triggered modals or tabs

How to test in 30 seconds: curl -s yourdomain.com/your-article | grep "first paragraph text". If the grep returns nothing, the content isn't in the raw HTML, and Googlebot's initial crawl won't see it either.

Domain 2: Core Web Vitals

Core Web Vitals are Google's performance-to-ranking bridge. The thresholds below are the fixed values from Google's documentation.

| Metric | Pass | Needs Improvement | Fail | Ranking Impact |
| --- | --- | --- | --- | --- |
| LCP (Largest Contentful Paint) | < 2.5s | 2.5–4.0s | > 4.0s | High at failure threshold |
| INP (Interaction to Next Paint) | < 200ms | 200–500ms | > 500ms | Significant for interactive sites |
| CLS (Cumulative Layout Shift) | < 0.1 | 0.1–0.25 | > 0.25 | High, directly suppresses rankings |

Critical note: INP replaced FID (First Input Delay) as a Core Web Vital in March 2024. If your technical SEO documentation still references FID, it's outdated. INP measures the full interaction lifecycle (input delay, processing time, and rendering time) for all interactions, not just the first, which makes it significantly harder to pass on JavaScript-heavy applications.
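The thresholds in the table translate directly into a classifier you can run against field values, e.g. 75th-percentile numbers from the CrUX API. A minimal sketch (metric names and units mirror the table above):

```python
# Core Web Vitals thresholds from the table above: (pass-below, fail-above)
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "INP": (200, 500),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless
}

def rate(metric: str, value: float) -> str:
    """Classify a field value as pass / needs improvement / fail."""
    good, poor = THRESHOLDS[metric]
    if value < good:
        return "pass"
    if value <= poor:
        return "needs improvement"
    return "fail"

# Example 75th-percentile field values (illustrative)
print(rate("LCP", 2.1))   # pass
print(rate("INP", 320))   # needs improvement
print(rate("CLS", 0.31))  # fail
```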

LCP Optimization Checklist

  • Identify your LCP element for each page type using DevTools → Performance → LCP
  • LCP element (usually hero image or H1) is in the initial HTML, not lazy-loaded
  • Hero images use <link rel="preload" as="image"> in the <head>
  • Images served in WebP or AVIF format (30–50% smaller than JPEG at equivalent quality)
  • CDN with edge caching active, TTFB (Time to First Byte) under 600ms
  • No render-blocking <script> tags in <head> without defer or async
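Several of these checks come together in the document head and hero markup. A minimal HTML sketch, assuming a hypothetical hero image path (/images/hero.webp) and script path:

```html
<head>
  <!-- Preload the hero (LCP) image so it starts downloading immediately; hypothetical path -->
  <link rel="preload" as="image" href="/images/hero.webp" type="image/webp">
  <!-- Defer non-critical JavaScript so it doesn't block rendering -->
  <script src="/js/app.js" defer></script>
</head>
<body>
  <!-- Hero image: never lazy-loaded, with explicit dimensions to avoid layout shift -->
  <img src="/images/hero.webp" width="1200" height="630" alt="Hero" fetchpriority="high">
</body>
```

The `fetchpriority="high"` hint tells the browser to prioritize the hero image over other resource fetches.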

INP Optimization Checklist

  • No JavaScript tasks longer than 50ms blocking the main thread
  • Event handlers debounced or throttled appropriately
  • Third-party scripts (analytics, chat widgets) loaded asynchronously
  • React/Next.js: heavy computations moved to Web Workers or server components

CLS Optimization Checklist

  • All images have explicit width and height attributes
  • Font loading uses font-display: swap or font-display: optional
  • Ad slots and dynamic content containers have reserved height (CSS min-height)
  • No elements using layout-affecting CSS animations (animate transform not height)
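The CSS side of these checks can be sketched in a few rules; class names, font names, and paths here are hypothetical:

```css
/* Reserve space for a late-loading ad slot so it can't shift content below it */
.ad-slot { min-height: 250px; }

/* Render text immediately in a fallback font, then swap in the web font */
@font-face {
  font-family: "BodyFont";                         /* hypothetical family */
  src: url("/fonts/body.woff2") format("woff2");   /* hypothetical path */
  font-display: swap;
}

/* Animate transform (compositor-only), not height, so no layout work is triggered */
.accordion-open { transform: scaleY(1); transition: transform 150ms ease; }
```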

Key Takeaway: Use Google Search Console's Core Web Vitals report for ranking-relevant data, not PageSpeed Insights lab scores. Lab scores diagnose problems. Field scores (CrUX) determine ranking impact.

Domain 3: Mobile-First Technical Requirements

Google uses mobile-first indexing for all sites. The mobile version of your page is the version that ranks.

  • Viewport meta tag present and correct: <meta name="viewport" content="width=device-width, initial-scale=1">
  • No horizontal scrolling at any viewport width (test at 320px, 375px, 414px)
  • Touch targets at least 48×48px (Google's documented minimum; Apple's HIG recommends 44×44pt)
  • Text readable without zooming at 16px base font size
  • No intrusive interstitials (pop-ups that cover main content on mobile), Google can penalize these
  • Test with Google's Mobile-Friendly Test (Search Console) and real device testing

Common Mobile Issues That Are Routinely Missed

Fixed-width elements: A container with width: 800px in a mobile viewport causes horizontal scroll. Audit with DevTools device emulation and look for horizontal overflow.

Font size below 12px: Google flags text smaller than 12px as unreadable on mobile. Check your CSS for font sizes below this threshold, especially in captions, footnotes, and disclaimers.

Interstitial timing: A pop-up that appears after 30 seconds on page doesn't trigger Google's penalty, but one that appears before the user can reach the content does. Time your overlays carefully.

Domain 4: Structured Data

Structured data serves two purposes in 2025: unlocking Google rich results features (which improve CTR) and signaling content type to AI retrieval systems (which affects GEO citation rates).

Required for Every Article

  • Article schema with: headline, datePublished, dateModified, author (Person), publisher (Organization with logo), description
  • Author Person entity: include name, url (link to author page), and ideally sameAs (links to social profiles, LinkedIn) for knowledge graph building
  • BreadcrumbList schema matching visible breadcrumb navigation
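The Article schema described above can be generated from your CMS data and embedded as JSON-LD. A minimal sketch with hypothetical author, URL, and publisher values:

```python
import json

# Minimal Article JSON-LD covering the fields listed above; all values are hypothetical
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist 2025",
    "datePublished": "2025-02-20",
    "dateModified": "2025-02-20",
    "description": "A practitioner's technical SEO checklist for 2025.",
    "author": {
        "@type": "Person",
        "name": "Jane Doe",                              # hypothetical author
        "url": "https://example.com/authors/jane",
        "sameAs": ["https://www.linkedin.com/in/janedoe"],
    },
    "publisher": {
        "@type": "Organization",
        "name": "Example Publisher",
        "logo": {"@type": "ImageObject", "url": "https://example.com/logo.png"},
    },
}

# Embed the output in the page as: <script type="application/ld+json"> ... </script>
print(json.dumps(article_schema, indent=2))
```

Validate the rendered output with the Rich Results Test before shipping.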

Recommended for Most Articles

  • FAQPage schema for any article with a FAQ section (questions/answers must be visible on page; note that since August 2023 Google limits FAQ rich results to authoritative government and health sites, but the markup still signals content type to AI retrieval systems)
  • HowTo schema for step-by-step instructional content

Site-Level Schema

  • Organization schema on homepage with logo, url, sameAs (social profiles), contactPoint
  • WebSite schema with potentialAction SearchAction for sitelinks search box (if applicable)

Validation

  • All schema validated with Google's Rich Results Test
  • No schema on pages where the content type doesn't match (FAQPage on a page with no FAQ)
  • Schema is in <script type="application/ld+json"> format, not RDFa or Microdata (JSON-LD is Google's preferred format)

Key Takeaway: FAQPage schema was long the single highest-ROI structured data implementation for blog content, expanding SERP listings to several times their normal height. Since Google restricted FAQ rich results to authoritative government and health sites in August 2023, its value for most blogs now lies in signaling content structure to AI retrieval systems rather than in extra SERP real estate.

Domain 5: HTTPS & Security

These are table stakes in 2025, but the nuances matter.

  • Valid SSL certificate with at least 90 days until expiration (set up renewal alerts)
  • HTTP URLs redirect to HTTPS with 301 (not 302)
  • www and non-www redirect consistently (pick one and 301 the other)
  • HSTS header present: Strict-Transport-Security: max-age=31536000; includeSubDomains
  • No mixed content warnings (HTTP resources loaded on HTTPS pages)
  • Security headers present: X-Content-Type-Options: nosniff, X-Frame-Options: SAMEORIGIN
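These header checks are easy to automate against the output of `curl -I`. A sketch over a hypothetical captured header set:

```python
# Hypothetical response headers, e.g. captured with `curl -I https://example.com`
headers = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "SAMEORIGIN",
}

# Header name -> substring that must appear in its value
# (X-Frame-Options: DENY is also acceptable; SAMEORIGIN matches the checklist above)
REQUIRED = {
    "Strict-Transport-Security": "max-age=",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "SAMEORIGIN",
}

missing = [name for name, needle in REQUIRED.items()
           if name not in headers or needle not in headers[name]]
print("missing or misconfigured:", missing or "none")
```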

Mixed content check: Browser DevTools → Console. Any "Mixed Content" warnings need to be fixed. Common culprits: hardcoded HTTP URLs in images or scripts, third-party embeds using HTTP endpoints.

Domain 6: Performance & Asset Optimization

Performance beyond Core Web Vitals: these checks don't have direct ranking thresholds, but they affect the user experience signals that indirectly drive rankings.

Images

  • All images served in WebP or AVIF format (convert with Squoosh, ImageMagick, or at the CDN level)
  • loading="lazy" on all images below the fold
  • loading="eager" (or no loading attribute) on hero/LCP images; lazy-loading the LCP element is one of the most common self-inflicted LCP problems
  • Responsive images with srcset for different viewport densities
  • No images larger than 200KB after compression (use 100KB as a target for most use cases)
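The 200KB budget can be enforced in CI with a short directory scan. A sketch using only the standard library; "public/images" is a hypothetical asset directory:

```python
from pathlib import Path

MAX_BYTES = 200 * 1024  # the 200KB ceiling from the checklist above

def oversized_images(root: str) -> list:
    """Walk an asset directory and flag raster images above the compression budget."""
    root_path = Path(root)
    if not root_path.is_dir():
        return []
    exts = {".jpg", ".jpeg", ".png", ".webp", ".avif"}
    return [(str(p), p.stat().st_size)
            for p in root_path.rglob("*")
            if p.suffix.lower() in exts and p.stat().st_size > MAX_BYTES]

# Point this at your build output (hypothetical directory name)
for path, size in oversized_images("public/images"):
    print(f"{path}: {size // 1024}KB exceeds the 200KB budget")
```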

Server & Delivery

  • Brotli compression enabled (20–30% better compression ratio than Gzip; check with curl -H "Accept-Encoding: br" -I url)
  • Browser caching headers set (Cache-Control: max-age=31536000, immutable for hashed assets)
  • CDN with edge PoPs covering your primary traffic geography
  • TTFB under 600ms from your primary geography (check with GTmetrix from relevant server location)

Fonts

  • Web fonts loaded with font-display: swap or font-display: optional
  • Font preconnect for Google Fonts: <link rel="preconnect" href="https://fonts.googleapis.com"> plus <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin> for the font files themselves
  • Not more than 2–3 font families with 2–3 weights each (each additional weight is an additional HTTP request)
  • Consider self-hosting fonts to eliminate third-party DNS lookup latency

Key Takeaway: loading="lazy" on your hero image is one of the most common LCP problems we see in audits. It looks right in code review and completely breaks real-world performance. Always set loading="eager" or omit the attribute on above-the-fold images.

Domain 7: Internal Linking Architecture

Internal linking is PageRank distribution. Your internal link structure determines which pages receive the most PageRank from your domain's authority, and therefore which pages have the best chance of ranking.

| Issue | Detection Method | Impact |
| --- | --- | --- |
| Orphaned pages | Screaming Frog crawl → filter to pages with 0 inlinks | High, zero internal PageRank |
| Pages buried > 3 clicks from homepage | Screaming Frog crawl depth report | Medium, reduced crawl priority |
| Generic anchor text ("click here", "read more") | Screaming Frog → Anchor text report | Medium, missed keyword signals |
| Broken internal links | Screaming Frog → Response codes → 404 | High, crawl budget waste |

Internal Linking Checklist

  • No orphaned pages (every published URL has at least one internal link pointing to it)
  • No content buried more than 3 clicks from the homepage
  • All internal anchor text is descriptive (describes the target page's content)
  • No broken internal links (any returning 404 or 5xx)
  • Related articles linked within the body content, not just in sidebars
  • Hub pages (category/topic overview pages) link to all relevant sub-articles
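Orphan detection and click depth both fall out of a breadth-first search over your internal link graph. A sketch over a hypothetical crawl (paths are illustrative):

```python
from collections import deque

# Hypothetical internal link graph from a crawl: page -> pages it links to
links = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-a", "/blog/post-b"],
    "/blog/post-a": ["/blog/post-b"],
    "/blog/post-b": [],
    "/about": [],
    "/orphan-page": [],  # published, but no internal link points to it
}

# Click depth from the homepage via breadth-first search
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

# Orphans: published pages (other than the homepage) that no page links to
linked_to = {t for targets in links.values() for t in targets}
orphans = [p for p in links if p != "/" and p not in linked_to]
buried = [p for p, d in depth.items() if d > 3]

print("orphans:", orphans)
print("buried deeper than 3 clicks:", buried)
```

Note that unreachable pages never appear in the depth map at all, which is exactly why the orphan check is computed separately.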

Building an Effective Internal Linking System

The principle: internal links should reflect the semantic structure of your content, not just your navigation menu. If you have 10 articles about Core Web Vitals, they should all link to each other with descriptive anchor text, not just exist in a category page.

The practical implementation:

  1. For each new article published, identify 3–5 existing articles it's relevant to and add contextual links in both directions
  2. Quarterly: run a crawl and check the inlink count for each URL. Any page with under 3 internal links needs attention
  3. Maintain a topic cluster map: which articles belong to which semantic cluster, and which is the hub that should receive the most internal link equity

The Priority Stack: Where to Start

Not all 47 checks carry equal weight. If you're starting a technical audit with limited time, work through these in order:

  1. JavaScript rendering: if your content isn't in the initial HTML, nothing else matters until this is fixed
  2. Core Web Vitals field data: check GSC, identify your worst-performing pages, start with CLS (fastest to fix)
  3. Orphan page audit: high impact, often overlooked, quick to fix
  4. Redirect chain audit: run Screaming Frog, flatten all chains
  5. Structured data: Article + FAQPage schema on all content pages
  6. Image optimization: WebP conversion + lazy loading audit (catch the LCP lazy-load bug)
  7. robots.txt + canonical review: confirm nothing important is accidentally blocked or miscanonicalized

The full 47-point audit above covers the complete picture. But these seven, executed well, address roughly 80% of the technical SEO impact available on a typical content site.

Key Takeaway: A technical SEO audit is not a one-time exercise. The sites that maintain technical excellence run automated checks continuously, not manual crawls once a year. Ranking Lens's technical audit score tracks your 47-point checklist status automatically so you catch regressions before they damage rankings.



Topics & Tags: Technical SEO, Checklist, Core Web Vitals, Crawling, Schema, Performance, 2025

Tammo is the founder of Ranking Lens and an expert in SEO & Generative Engine Optimization (GEO). He helps businesses get found in Google, ChatGPT, and AI Overviews.