Backlynk
Technical SEO · 18 min read

Technical SEO Checklist: 40+ Points to Audit Your Site

61% of websites fail Core Web Vitals. 52% have broken internal links. 74% are missing image alt text. This 40-point technical SEO checklist covers every category — crawlability, HTTPS, Core Web Vitals, schema, and more — with tools, thresholds, and fix priorities.


James Mitchell

Technical SEO Lead

Key Takeaways

- 61% of websites fail Google's Core Web Vitals thresholds — technical issues are the norm, not the exception (HTTP Archive 2025 Web Almanac)
- INP replaced FID as a Core Web Vital in March 2024; most audit guides haven't updated to reflect this change
- 52% of websites have broken internal or external links that suppress crawl efficiency and leak link equity (Semrush crawl analysis, 50,000+ domains)
- 74% of websites have images missing alt text — the highest-prevalence technical error on the web, and one of the fastest to fix
- A structured quarterly audit catches the issues that compound into 10-30% organic traffic losses over 6-18 months

The Numbers Behind the Checklist

Before you start clicking through audit tabs, consider what the aggregate data says about the technical state of the web.

Per the HTTP Archive's 2025 Web Almanac, only 39% of websites pass all three Core Web Vitals thresholds. On mobile — where Google's crawler indexes your site first under mobile-first indexing — the pass rate drops to 38.5%. That means roughly 6 in 10 websites are handing Google a concrete performance-based ranking disadvantage before a single piece of content is evaluated.

Broken links compound the problem. Semrush's crawl analysis of over 50,000 domains found that 52% of websites have broken internal or external links that need fixing. Each broken internal link is a dead end in Google's crawl path — a pathway that should be carrying link equity and directing Googlebot to your content, silently failing.

And the most prevalent issue? Per PageOptimizer Pro's 2025 small business SEO study, 74% of websites have images with missing alt text — a fix that takes minutes per image once identified.

This is not a story about sophisticated, hard-to-detect problems. Most technical SEO failures are elementary, silent, and fixable in a single sprint. The challenge is that they're invisible without systematic auditing. This checklist gives you the systematic framework.

Tools Required Before You Start

| Tool | Cost | Primary Function |
|---|---|---|
| Google Search Console | Free | Crawl errors, indexing status, Core Web Vitals field data |
| Screaming Frog SEO Spider | Free up to 500 URLs / £259/yr | Full site crawl, broken links, redirects, canonical tags |
| Google PageSpeed Insights | Free | CWV measurements and performance diagnostics |
| Ahrefs Webmaster Tools | Free (own site only) | Backlink profile, site health, crawl errors |
| Semrush Site Audit | Free tier / $139.95+/mo | 170+ parameter technical audit at scale |
| Chrome Lighthouse | Free (built into Chrome DevTools) | Page-level performance, accessibility, SEO audit |

Category 1: Crawlability and Indexation

Crawlability is the prerequisite for everything. If Google cannot reach and index your pages, no other optimization matters. Start every audit here.

1. robots.txt Configuration

Visit yourdomain.com/robots.txt. Verify it is not blocking important pages, directories, CSS files, or JavaScript. The most common production mistakes: blocking /wp-content/ on WordPress (which prevents Google from rendering your pages), carrying over staging-environment Disallow rules, and blocking subdirectories that contain content.

Fix: Remove any Disallow rules blocking content pages or rendering resources. Only disallow admin areas, internal search result pages, duplicate parameter URLs, and pages with zero SEO value.
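
A minimal sketch of a safe configuration for a WordPress site follows; the exact paths are assumptions, so adapt them to your CMS:

```text
# Allow everything by default; disallow only zero-SEO-value areas
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Block internal search result pages
Disallow: /?s=

Sitemap: https://yourdomain.com/sitemap.xml
```

Note what is absent: no Disallow on /wp-content/, so Googlebot can fetch the CSS, JavaScript, and images it needs to render your pages.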

2. XML Sitemap

Verify your sitemap exists — usually at /sitemap.xml or the path declared in robots.txt. Check that it contains all canonical, indexable pages. Common failures: the sitemap includes noindexed pages (conflicting signal), includes redirect URLs (should point to final destinations), or hasn't been submitted to Google Search Console.

Fix: Regenerate the sitemap excluding noindex pages and redirect URLs. Submit or re-submit to GSC under the Sitemaps section after any major structural change.
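
For reference, a valid sitemap entry points only at final, indexable, 200-status URLs. A minimal sketch with a placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical, indexable URLs only: no redirects, no noindexed pages -->
  <url>
    <loc>https://yourdomain.com/technical-seo-checklist/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
</urlset>
```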

3. Crawl Errors and Indexing Gaps

In Google Search Console, navigate to Pages and review "Why pages aren't indexed." Each failure category has a distinct root cause. The ones that require immediate action:

  • Blocked by robots.txt — important content being blocked from Google's crawler
  • Noindex tag — verify these are intentional, not accidental (common after plugin conflicts or CMS updates)
  • Soft 404 — page returns a 200 HTTP status but Google treats it as empty or not found
  • Discovered — currently not indexed — Google found the page but chose not to index it (thin content, quality concern)
  • Crawled — currently not indexed — Google evaluated the page and declined to index it (more serious quality signal)

Fix: "Blocked by robots.txt" is the most urgent. "Crawled — currently not indexed" typically requires content quality improvement, not a technical fix.

4. Crawl Budget (Sites with 10,000+ Pages)

In GSC, go to Settings → Crawl Stats. For large sites, compare the number of pages Googlebot is actually crawling against your total indexable URL count. If Googlebot crawls only 10-15% of your pages, important content is not being recrawled and indexed regularly.

Fix: Improve crawl budget efficiency by noindexing low-value pages (thin content, internal search results, parameter-generated duplicates), fixing crawl traps (infinite pagination, calendar URLs), improving page speed, and building stronger internal links to priority pages.

5. Noindex Audit

Run Screaming Frog and filter by "noindex" in the Directives column. Verify every noindexed page is intentionally noindexed. Common accidental noindex sources: CMS staging settings copied to production, SEO plugin misconfiguration, developer test tags left in place.

Fix: Remove accidental noindex tags. Retain intentional noindex on: admin pages, thank-you pages, paginated pages beyond depth 2 (optional), and thin filter/parameter pages.

6. Canonical Tag Implementation

In Screaming Frog, review the Canonicals tab. Every page should have a self-referencing canonical tag or a canonical pointing to the correct URL. Flag: pages missing canonical tags entirely, canonicals pointing to redirect URLs instead of final destinations, and canonicals pointing to noindexed pages (creates conflicting signals Google cannot resolve cleanly).
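
A correct self-referencing canonical is a single line in the head; the URL here is a placeholder:

```html
<!-- Points at this page's own final URL: never a redirect or a noindexed page -->
<link rel="canonical" href="https://yourdomain.com/technical-seo-checklist/">
```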

7. Crawl Depth

Priority pages should be reachable within 3-4 clicks from the homepage. Pages buried at depth 7+ receive less crawl priority and accumulate less internal link equity. In Screaming Frog, review the Crawl Depth report and identify important content that is too deep in the site structure.

Fix: Improve internal linking to surface important pages. Add contextual links from high-traffic content to deeply buried pages. Build a hierarchical HTML sitemap for large sites.

8. JavaScript Rendering Check

Use Google Search Console's URL Inspection tool on your key pages. Click "Test Live URL" and then view the screenshot — this shows what Google's renderer actually sees. If your main content is missing or incomplete in the rendered screenshot, Google cannot see it regardless of what your browser displays.

Fix: Server-side render critical content, or ensure JavaScript loads fast enough for Googlebot's rendering queue. Never place key content or internal links exclusively in slow-loading JavaScript files.

Category 2: HTTPS and Security

9. HTTPS Implementation

Verify all pages load on HTTPS. In Google Search Console, review the HTTPS report (under Experience) for pages still served over HTTP. Confirm that yourdomain.com (without HTTPS) redirects to the HTTPS version via a 301 redirect.
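
As a sketch, the HTTP-to-HTTPS redirect looks like this on nginx (Apache and managed hosts use different mechanisms, and the domain is a placeholder):

```nginx
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    # Permanent (301) redirect preserving path and query string
    return 301 https://yourdomain.com$request_uri;
}
```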

10. Mixed Content Errors

Open Chrome DevTools on key pages and check the Console tab for "Mixed Content" warnings. These appear when HTTP resources — images, scripts, stylesheets — are loaded on an HTTPS page. They trigger browser security warnings and dilute the security signal Google expects.

Fix: Update all internal resource URLs to HTTPS. For external resources, contact the provider or self-host.

11. SSL Certificate Validity

Verify your certificate has not expired and covers all subdomains in use (including www.). Check expiry via Chrome's lock icon → "Connection is secure" → Certificate details. Set up auto-renewal if not already configured — an expired certificate causes immediate ranking damage and browser security errors.

12. HTTP Security Headers

Advanced check: verify your server sends appropriate security headers (Strict-Transport-Security, X-Frame-Options, Content-Security-Policy). These are not direct ranking factors but Google's Chrome browser treats sites with complete security headers as more trustworthy. Check with securityheaders.com.
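
A reasonable baseline header set looks like the following; the values are illustrative, and Content-Security-Policy in particular must be tuned to your site's actual resources before deploying:

```text
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
Content-Security-Policy: upgrade-insecure-requests
```

The upgrade-insecure-requests directive also resolves many mixed content warnings (point 10) by forcing HTTP subresources to load over HTTPS.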

Category 3: Core Web Vitals and Page Speed

This is where most sites fail. Per HTTP Archive's 2025 Web Almanac, only 38.5% of websites pass all three Core Web Vitals on mobile — making this the single highest-impact category in any technical audit.

13. LCP (Largest Contentful Paint) — Target: 2.5 seconds or less

LCP measures how long until the largest visible element — typically a hero image or primary headline — is rendered. The 75th percentile of real-user data must hit 2.5 seconds or better for a "Good" rating in Google Search Console.

Check: GSC → Core Web Vitals report shows field data aggregated across real users. Google PageSpeed Insights provides URL-level diagnostics.

Common causes and fixes:

- Slow server response time (TTFB over 600ms): upgrade hosting, implement a CDN
- Unoptimized hero images: convert to WebP format (25-35% smaller at equivalent quality), add explicit width/height attributes, preload the LCP image in the HTML head
- Render-blocking CSS or JavaScript: defer non-critical scripts, inline critical above-fold CSS
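
The image fixes above reduce to a small HTML pattern. A sketch with placeholder paths and dimensions:

```html
<head>
  <!-- Fetch the LCP image immediately, at high priority -->
  <link rel="preload" as="image" href="/img/hero.webp" fetchpriority="high">
</head>
<body>
  <!-- Explicit dimensions reserve layout space; never lazy-load the LCP image -->
  <img src="/img/hero.webp" width="1200" height="600" alt="Product dashboard" fetchpriority="high">
</body>
```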

14. INP (Interaction to Next Paint) — Target: 200ms or less

Critical update: INP replaced FID (First Input Delay) as an official Core Web Vital in March 2024. Many audit tools and checklists published before this date still report FID. INP measures the full interaction lifecycle — from user input to the next visual update — not just initial delay. This is harder to optimize than FID was.

Check: PageSpeed Insights shows INP from real Chrome user data (CrUX). Chrome DevTools Performance panel for lab testing.

Common causes and fixes: long JavaScript tasks blocking the main thread, heavy event handlers on interactive elements, large DOM size (over 1,400 elements). Fix by breaking long tasks into smaller chunks and reducing DOM complexity.
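
A minimal sketch of the long-task fix: yield to the event loop between chunks so pending user input is handled promptly. processItem is a placeholder for your own per-item work:

```html
<script>
  // One long task becomes many short ones; between chunks, the browser
  // can respond to clicks and keypresses, which is what INP measures.
  async function processAll(items) {
    for (const item of items) {
      processItem(item); // placeholder for your actual work
      await new Promise((resolve) => setTimeout(resolve, 0));
    }
  }
</script>
```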

15. CLS (Cumulative Layout Shift) — Target: 0.1 or less

CLS measures visual instability — content shifting after initial render. A page that "jumps" as images load or ads appear scores poorly.

Common causes and fixes:

- Images without explicit dimensions: always set width and height attributes in HTML
- Dynamically injected ad banners: reserve space with fixed-height placeholder containers
- Web fonts causing layout shifts: use font-display swap or optional with appropriate fallback fonts
- Iframes and video embeds without dimensions: set explicit aspect ratios via CSS
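
A sketch of the first three fixes, with placeholder paths and values:

```html
<!-- Explicit dimensions let the browser reserve space before the image loads -->
<img src="/img/chart.webp" width="800" height="450" alt="Organic traffic growth chart">

<style>
  /* Show the fallback font immediately, then swap in the web font */
  @font-face {
    font-family: "BrandFont";
    src: url("/fonts/brand.woff2") format("woff2");
    font-display: swap;
  }
  /* Reserve a fixed-height slot for a dynamically injected ad banner */
  .ad-slot { min-height: 250px; }
</style>
```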

16. Time to First Byte (TTFB) — Target: 800ms or less

TTFB measures server response speed — how long until the first byte of the response arrives. It underpins all other performance metrics.

Check: Chrome DevTools Network tab shows TTFB for any request. PageSpeed Insights flags slow server response times.

Fix: Upgrade to better hosting if TTFB consistently exceeds 800ms. Implement server-side caching. Use a CDN for globally distributed audiences.

17. Image Optimization

Screaming Frog → Images tab → filter by file size. Any content images over 200KB need optimization. Any hero or banner images over 500KB represent a significant LCP bottleneck.

Fix: Convert all images to WebP format. Resize images to their maximum display dimensions (no oversizing). Implement lazy loading only for below-the-fold images — lazy loading your LCP image is a common mistake that significantly increases LCP.
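
The eager-versus-lazy split in practice, with placeholder paths:

```html
<!-- Above the fold (LCP candidate): load eagerly at high priority -->
<img src="/img/hero.webp" width="1200" height="600" alt="Hero image" fetchpriority="high">

<!-- Below the fold: safe to defer until the user scrolls near it -->
<img src="/img/footer-banner.webp" width="800" height="200" alt="Newsletter banner" loading="lazy">
```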

18. Render-Blocking Resources

PageSpeed Insights → "Eliminate render-blocking resources" diagnostic lists specific CSS and JavaScript files delaying First Contentful Paint.

Fix: Add defer or async attributes to non-critical JavaScript. Load non-critical CSS asynchronously. Inline critical above-fold CSS directly in the HTML.
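
A sketch of that loading pattern; /css/styles.css and /js/analytics.js are placeholders:

```html
<head>
  <!-- Critical above-the-fold CSS inlined: zero render-blocking requests -->
  <style>/* critical rules here */</style>

  <!-- Non-critical CSS loaded without blocking first paint -->
  <link rel="stylesheet" href="/css/styles.css" media="print" onload="this.media='all'">

  <!-- Non-critical JavaScript deferred until the document is parsed -->
  <script src="/js/analytics.js" defer></script>
</head>
```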

Category 4: Mobile Optimization

19. Mobile-First Indexing Readiness

Google has used mobile-first indexing for all sites since 2023. The mobile version of your site is what Google crawls, indexes, and uses for ranking — not the desktop version.

Check: In GSC URL Inspection, "Crawled as" should show "Googlebot Smartphone." Use Chrome DevTools device emulation to check your real mobile experience on key pages.

Fix: Ensure all desktop content is accessible on mobile. Do not hide important text or internal links behind "tap to expand" features on mobile if those features don't render in the HTML.

20. Viewport Meta Tag

Every page must include the viewport meta tag: name="viewport" with content="width=device-width, initial-scale=1". Without it, Google treats the page as not mobile-optimized.

Check: Screaming Frog flags pages with a missing viewport tag (the Mobile tab in recent versions); Chrome Lighthouse also reports it in its SEO audit.
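
For reference, the exact tag:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```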

21. Touch Target Sizing

Google's minimum recommendation for touch targets is 48x48 pixels with sufficient spacing between adjacent interactive elements. Smaller targets cause tap accuracy errors and repeated "rage taps" that degrade the mobile experience.

Check: Chrome Lighthouse accessibility audit flags touch targets that fail the minimum size requirement.

22. No Intrusive Interstitials

Google penalizes pages with interstitials — full-page popups, large overlays — that obscure content on mobile immediately after page load. This is part of the Page Experience signal set.

Fix: Avoid full-page popups on the initial page load for mobile visitors. Cookie consent banners must be dismissible and cannot cover the full viewport.

Category 5: URL Structure and Architecture

23. URL Structure Quality

Review your URL patterns. Good URLs are short, lowercase, hyphen-separated, and descriptive (/technical-seo-checklist/ versus /article?id=1482&cat=87). URLs with excessive parameters, session IDs, or random strings create duplicate content problems and waste crawl budget.

Fix: For new content: implement clean URL patterns from the start. For existing content with backlinks: never change live URLs without implementing 301 redirects.

24. Broken Internal Links

Screaming Frog → Response Codes → 4XX filter. Also check GSC → Pages → Not Found (404). Per Semrush's analysis of 50,000+ domains, 52% of websites have broken internal or external links that create crawl dead-ends and leak link equity.

Fix: Update all internal links to their current correct destinations. Set up 301 redirects for any URL that has moved permanently.

25. Redirect Chains and Loops

A redirect chain exists when URL A redirects to URL B which redirects to URL C instead of A going directly to C. A redirect loop is circular: A redirects to B which redirects back to A.

Check: Screaming Frog → Redirects → Redirect Chains report.

Fix: Update the source URL to point directly to the final destination URL. Never chain more than one redirect — each hop in a chain passes less link equity and adds latency.

26. Duplicate Content

Check: Screaming Frog → Content → Duplicate Pages and Near-Duplicate Pages. Also check for parameter-generated duplicates (?sort=price, ?session_id=, ?color=red).

Fix: Implement canonical tags pointing to the preferred version. For parameter-generated duplicates, rely on canonical tags or robots.txt disallow rules (Google retired GSC's URL Parameters tool in 2022). Add noindex to low-value parameter pages.

27. Pagination Handling

Check whether paginated URLs (/category/page/2/, /category/page/3/) are being indexed and consuming crawl budget without adding value.

Fix: For most paginated archives, implement self-referencing canonical tags on each paginated page. Ensure the main category page is well-linked and receives the bulk of crawl attention.

Category 6: Structured Data and Schema

28. Schema Markup Implementation

Use Google's Rich Results Test tool on your key pages to check what schema types are implemented and whether they are valid. Structured data eligibility for rich results — star ratings, FAQ snippets, event listings, recipe cards — can increase CTR by 20-30%, per Semrush's SERP feature analysis data. Higher CTR feeds into NavBoost signals confirmed in the May 2024 Google API leak.

29. Schema Validation

The Rich Results Test surfaces errors and warnings. Errors prevent rich result eligibility entirely — address all errors. Warnings are typically missing optional fields and may be acceptable depending on the schema type.

30. Organization and LocalBusiness Schema

Your homepage should include Organization schema (or LocalBusiness for local businesses) containing: name, URL, logo, contact information, and sameAs links to all major social profiles. This builds your entity profile in Google's Knowledge Graph and helps AI models identify your brand accurately.

Fix: Implement via JSON-LD in the page head section. Most modern SEO plugins handle this automatically — verify it's correctly configured and contains complete data.
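
A minimal JSON-LD sketch; every value is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://yourdomain.com",
  "logo": "https://yourdomain.com/img/logo.png",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-000-0000",
    "contactType": "customer service"
  },
  "sameAs": [
    "https://www.linkedin.com/company/example-co",
    "https://x.com/example_co"
  ]
}
</script>
```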

31. Breadcrumb Schema

Breadcrumb schema helps Google understand your site hierarchy and generates breadcrumb-style rich results in search. Check implementation on all non-root pages, and verify it matches your visible breadcrumb navigation.
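
A two-level sketch assuming a /blog/ section; per Google's documentation, the final item can omit its URL:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://yourdomain.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://yourdomain.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Checklist" }
  ]
}
</script>
```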

Category 7: On-Page Technical Elements

32. Title Tag Audit

Screaming Frog → Page Titles → filter for: missing titles, duplicate titles, titles over 60 characters (Google truncates beyond approximately 580px display width), and titles under 30 characters (too short — missed opportunity).

Fix: Every page needs a unique, descriptive title within 50-60 characters that accurately reflects the page content and includes the primary keyword naturally.

33. Meta Description Audit

Screaming Frog → Meta Descriptions → filter for: missing, duplicate, over 160 characters, under 70 characters.

Note: Google rewrites meta descriptions in roughly 62% of cases (per Portent's 2024 study). That is not a reason to leave them blank. Well-written meta descriptions set correct expectations, improve CTR when Google does use them, and are displayed in social sharing previews.

34. H1 Tag Uniqueness

Every page should have exactly one H1 tag that clearly identifies the page's main topic. Screaming Frog → H1 → filter for: missing H1, pages with multiple H1 tags, and duplicate H1 text across pages.

35. Image Alt Text

Per PageOptimizer Pro's 2025 analysis, 74% of websites have images with missing alt text — the highest-prevalence technical error in modern SEO audits. Screaming Frog → Images → Missing Alt Text tab.

Fix: Add descriptive alt text to all content images. Decorative images (purely visual, no informational value) should use empty alt="" attributes. Alt text should describe the image accurately — not stuff keywords.

36. Open Graph and Social Meta Tags

When your content is shared on social media, does it render with a proper preview image, title, and description? Use a social preview checker (opengraph.xyz) to verify your key pages.

Fix: Implement Open Graph tags (og:title, og:description, og:image, og:url) and Twitter Card tags on all shareable pages. SEO plugins like Yoast or Rank Math handle this automatically — verify configuration is active and images are specified.
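
The core tag set as a sketch, with placeholder values:

```html
<meta property="og:title" content="Technical SEO Checklist: 40+ Points to Audit Your Site">
<meta property="og:description" content="A 40-point checklist covering crawlability, HTTPS, Core Web Vitals, and schema.">
<meta property="og:image" content="https://yourdomain.com/img/og-checklist.png">
<meta property="og:url" content="https://yourdomain.com/technical-seo-checklist/">
<meta name="twitter:card" content="summary_large_image">
```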

37. Orphan Pages

An orphan page exists in your sitemap but has no internal links pointing to it from other pages on your site. Google assigns orphan pages the lowest crawl priority and minimal internal link equity.

Check: Cross-reference your sitemap URL list against Screaming Frog's internally linked pages list. Any URL present in the sitemap but not linked internally is an orphan.

Fix: Add internal links from at least 2-3 relevant pages to every important page. If the page genuinely has no relevant linking opportunities, consider whether it needs to exist.

Category 8: Advanced Technical and AI Search

38. Server Log Analysis

Access your web server access logs and filter for Googlebot. Verify Googlebot is regularly visiting your priority pages — and not wasting crawl budget on low-value URLs like infinite paginated archives, parameter duplicates, or faceted navigation.

Fix: If Googlebot visits more low-value pages than priority pages: add noindex or robots.txt disallow to low-value page types, and improve internal linking toward priority content.

39. 5xx Error Monitoring

In GSC → Pages, check for Server Errors (5XX). Any server errors Googlebot encounters during crawling degrade crawl efficiency and can trigger deindexing of affected pages.

Fix: Resolve 500-series server errors immediately. Set up uptime monitoring (UptimeRobot, Better Uptime) to catch patterns before Google's crawler surfaces them in your reports.

40. E-E-A-T Signal Audit

Google's Search Quality Rater Guidelines define E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) as the framework evaluators use to assess content quality. While not directly a "technical" factor, E-E-A-T signals are auditable and improvable.

Check for: named authors with bios and verifiable credentials on all editorial content; a complete About page with organization information; physical address and contact details (especially for YMYL topics — Your Money or Your Life); Privacy Policy and Terms of Service pages; reviews and testimonials from identifiable sources.

41. llms.txt and AI Crawler Readiness

An emerging consideration for 2026 and beyond: AI search assistants (OpenAI, Perplexity, Google Gemini, Anthropic Claude) crawl the web independently. The llms.txt specification (analogous to robots.txt for AI crawlers) allows you to direct these crawlers to your most important and accurate content.

Check: Does your site have a /llms.txt file? Do you have clear, accurate information on your key pages that AI models can cite?

The relevance: per Pew Research Center's 2025 analysis, when an AI Overview appears in Google results, traditional blue links receive 8% CTR versus 15% without the AI Overview — a 47% reduction. However, inclusion in the AI Overview itself more than compensates. Optimizing for AI citation means ensuring your content is crawlable, factually specific, and clearly structured.
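
The specification is young and conventions are still settling, but a minimal /llms.txt is a plain markdown file along these lines (all names and paths are placeholders):

```text
# Example Co

> One-sentence factual description of what the site offers.

## Key pages

- [Technical SEO Checklist](https://yourdomain.com/technical-seo-checklist/): full audit framework
- [Pricing](https://yourdomain.com/pricing/): current plans and tiers
```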

42. Core Web Vitals Field Data vs Lab Data

A critical distinction: Core Web Vitals in Google Search Console reflect field data — real user measurements collected from Chrome browsers over a 28-day rolling window. PageSpeed Insights lab data tests a single simulated page load.

A page can score 95 in PageSpeed Insights lab tests but still fail Core Web Vitals in GSC because real users on diverse devices and network conditions experience worse performance than the simulated test environment. Always verify with field data, not just lab scores. If your site lacks sufficient real-user Chrome data, PageSpeed Insights will show "No Data" for field metrics — and your CWV status in GSC remains unknown until you accumulate enough traffic.

Priority Framework

Not all technical issues deserve the same urgency. Use this matrix to sequence your fixes:

| Priority | Issue Type | Examples | Timeline |
|---|---|---|---|
| P0 — Immediate | Indexing blockers | robots.txt blocking key pages, site-wide accidental noindex, no HTTPS on live site | Fix within 24 hours |
| P1 — This week | Core Web Vitals failures | LCP over 4s, CLS over 0.25 on key pages, INP over 500ms | Fix within 2 weeks |
| P2 — This sprint | Broken crawl paths | Redirect chains over 1 hop, broken internal links, orphan pages | Fix within 1 month |
| P3 — Quarterly | Optimization | Schema markup, alt text gaps, meta description updates, E-E-A-T signals | Address each audit cycle |

Frequently Asked Questions

How often should I run a technical SEO audit?

Quarterly for established sites. Immediately after any major change: migration, redesign, platform switch, URL restructuring. Monthly for e-commerce sites or daily-publishing content sites. Set up automated site monitoring in Semrush or Ahrefs between full audits to catch critical failures as they occur rather than waiting for the next scheduled review.

What is the most important technical SEO factor?

Crawlability and indexation — if Google cannot access and index your pages, no other optimization matters. Start every audit by confirming robots.txt is not blocking important content and that your key pages appear in GSC's indexed pages report. Once indexation is confirmed, Core Web Vitals represent the highest-leverage optimization for most established sites.

Can I complete a technical audit without paid tools?

Yes. Google Search Console covers crawl errors, indexing issues, Core Web Vitals field data, and mobile usability — all for free. Screaming Frog's free tier audits up to 500 URLs for broken links, redirects, canonical tags, and missing meta data. Google PageSpeed Insights provides page-level CWV diagnostics. Ahrefs Webmaster Tools (free for your own site) adds backlink data and site health scoring. Together, these free tools cover the majority of this checklist.

Does fixing technical SEO deliver immediate ranking improvements?

The timeline varies by issue type. Fixing indexation blockers (accidental noindex, robots.txt errors) can produce visible ranking improvements within 2-4 weeks as Google recrawls and reindexes the affected pages. Core Web Vitals improvements register in GSC's field data over a 28-day rolling window. Structural changes (URL architecture, internal linking improvements) typically take 30-90 days to fully process. Technical fixes unlock the potential of your content — they remove suppression rather than adding a direct ranking boost.

What are the most commonly missed technical SEO issues?

INP (which replaced FID in March 2024) is the most frequently missed because audit guides haven't caught up. Orphan pages are missed because most tools audit what's crawlable — but orphans are often not in the crawl path. Server log analysis revealing Googlebot's actual crawl behavior is skipped by most auditors despite being one of the most informative data sources available. And llms.txt for AI crawler optimization is emerging as a new category most sites haven't addressed at all.

How does technical SEO interact with backlink building?

They are multiplicative. A technically sound site extracts significantly more value from each backlink in its profile. Internal link structure determines how link equity from external backlinks distributes across your site — poor internal linking means a DR 70 backlink benefits only the page it lands on rather than flowing to your entire site. Run a technical audit alongside your backlink profile analysis for the most complete picture of what is suppressing your rankings.

---

*Technical issues silently suppress rankings you've already earned. Analyze your current backlink profile to see the authority signals available to your site, then submit to our directory database to build the referring domain foundation that technical excellence amplifies. Check our pricing plans to find the right tier for your site's scale.*

Written by


James Mitchell

Technical SEO Lead

Technical SEO Lead with a decade of experience in site architecture, crawl optimization, and search algorithm analysis. Built and scaled SEO programs for three venture-backed startups from zero to 500K+ monthly organic sessions.

technical SEO · site audit · Core Web Vitals · crawlability · SEO checklist
