Technical SEO Audit Template

Run a full technical SEO audit with this 50+ point checklist. Covers crawlability, indexation, site speed, mobile experience, structured data, and Core Web Vitals.

Content quality and keyword targeting matter, but they work on top of a technical foundation. If Google can't crawl your site, can't index your pages, or sees a slow, broken experience, no amount of great writing will get you to page one.

A technical SEO audit identifies the infrastructure problems that are silently limiting your content's ranking potential. This template gives you a systematic, actionable checklist for auditing a startup website.

Who Should Run a Technical SEO Audit

You should run a technical audit:

  • When you launch a new website
  • When organic traffic drops suddenly without an obvious cause
  • Before a major site migration or redesign
  • Quarterly as part of your standard SEO maintenance
  • After a significant CMS upgrade

You don't need to be a developer to complete this audit. Many items can be checked with free tools. For items that require code changes, you'll need developer support or your CMS's built-in settings.

Tools You'll Need

| Tool | Cost | What It's Used For |
|---|---|---|
| Google Search Console | Free | Indexing issues, Core Web Vitals, manual actions |
| Google PageSpeed Insights | Free | Page speed and Core Web Vitals |
| Screaming Frog SEO Spider | Free (up to 500 URLs) / $259/year | Site crawl, broken links, redirects, metadata |
| Ahrefs or Semrush | $99-120/month | Backlinks, site audit, keyword data |
| Google Chrome DevTools | Free | Manual page inspection |
| validator.schema.org | Free | Schema markup validation |
| SSL Checker (ssllabs.com) | Free | SSL certificate verification |

Averi automates this entire workflow

From strategy to drafting to publishing — stop doing it manually.

Start Free →

Part 1: Crawlability and Indexability

If Google can't crawl or index your pages, nothing else matters.

Robots.txt Audit

Your robots.txt file tells search engines which pages they can and cannot crawl.

Access at: yourdomain.com/robots.txt

Check:

  • Robots.txt file exists
  • No important pages accidentally blocked by Disallow: rules
  • Sitemap URL is declared in robots.txt: Sitemap: https://yourdomain.com/sitemap.xml
  • No Disallow: / (this blocks all crawling — a critical error if present)

Common mistakes to look for:

# BAD - blocks everything
User-agent: *
Disallow: /

# BAD - blocks all /blog/ pages
User-agent: *
Disallow: /blog/

# GOOD - blocks only specific admin paths
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Sitemap: https://yourdomain.com/sitemap.xml
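You can also sanity-check your rules programmatically. A minimal sketch using Python's built-in `urllib.robotparser` against a hypothetical robots.txt, confirming that blog pages stay crawlable while admin paths are blocked (the domain and paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to audit
robots_txt = """User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Key pages should be crawlable; admin paths should not be
print(parser.can_fetch("*", "https://yourdomain.com/blog/post"))    # True
print(parser.can_fetch("*", "https://yourdomain.com/admin/login"))  # False
```

Swap in your live file (fetched from `yourdomain.com/robots.txt`) and a list of your most important URLs to catch accidental blocks before Google does.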

Findings: ___ Action Required: ✅ / ❌ — ___

XML Sitemap Audit

Your sitemap tells search engines about all the pages you want indexed.

  • Sitemap exists at /sitemap.xml (or declared in robots.txt)
  • Sitemap submitted to Google Search Console
  • Sitemap contains all important pages (blog posts, pillar pages, key landing pages)
  • Sitemap does NOT include: noindex pages, redirected URLs, broken pages
  • Sitemap has fewer than 50,000 URLs (if more, use a sitemap index)
  • All URLs in sitemap use HTTPS (not HTTP)
  • Sitemap uses correct XML format (validate at xml-sitemaps.com)
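A quick programmatic pass catches the HTTPS and size checks above in one go. A sketch using only Python's standard library; the sample sitemap and domain are placeholders:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(xml_text: str) -> dict:
    """Return URL count, sitemap-index flag, and any non-HTTPS URLs."""
    root = ET.fromstring(xml_text)
    urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return {
        "url_count": len(urls),
        "needs_index": len(urls) > 50_000,  # over the limit: split into a sitemap index
        "non_https": [u for u in urls if not u.startswith("https://")],
    }

# Minimal example sitemap (hypothetical URLs)
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>http://yourdomain.com/old-page</loc></url>
</urlset>"""

print(audit_sitemap(sample))
```

Any URL surfacing in `non_https` should be updated (or redirected) before resubmitting the sitemap in Search Console.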

Findings: ___ Action Required: ✅ / ❌ — ___

Index Coverage Audit

In Google Search Console → Indexing → Pages (formerly the Coverage report):

  • Review "Errors" — these are pages that should be indexed but aren't
  • Review "Excluded" — check why pages are excluded (noindex, blocked by robots, duplicate)
  • Are any important pages showing as "noindex"?
  • Are any important pages blocked by robots.txt?
  • Are there "Discovered, not indexed" pages? (May indicate crawl budget issues)

Common errors to address:

| Error | Likely Cause | Fix |
|---|---|---|
| Page with redirect | Sitemap contains redirected URL | Update sitemap |
| Submitted URL seems to be a Soft 404 | Thin content or empty page | Add substantial content |
| Crawled, not indexed | Google doesn't find page valuable | Improve content quality |
| Blocked by robots.txt | Disallow rule too broad | Review robots.txt |

Total Indexation Rate:

  • Pages in sitemap: ___
  • Pages confirmed indexed: ___
  • Indexation rate: ___% (target: 90%+)

Part 2: Redirect and URL Audit

Redirect Chain Analysis

Redirect chains slow down page load and dilute link equity.

  • No multi-hop redirect chains (A→B→C is bad; a single redirect A→C is fine)
  • No redirect loops (A→B→A)
  • All redirects use 301 (permanent) not 302 (temporary) for permanent moves
  • Old URLs from site migrations are correctly redirected

How to check: Crawl your site in Screaming Frog with Configuration → Spider → Advanced → "Always Follow Redirects" enabled, then export the "All Redirects" report.
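If you export the crawler's redirect data as a simple source-to-destination map, a small script can flag chains and loops automatically. A sketch, assuming a hypothetical `{source: destination}` dict with placeholder URLs:

```python
def trace_redirects(start: str, redirects: dict, max_hops: int = 10) -> dict:
    """Follow a redirect map from `start`; report hop count and loops."""
    path = [start]
    seen = {start}
    while path[-1] in redirects:
        nxt = redirects[path[-1]]
        if nxt in seen:  # revisiting a URL means a redirect loop
            return {"path": path + [nxt], "loop": True, "hops": len(path)}
        path.append(nxt)
        seen.add(nxt)
        if len(path) - 1 >= max_hops:
            break
    return {"path": path, "loop": False, "hops": len(path) - 1}

# Hypothetical redirect map exported from a crawl
redirects = {"/old": "/interim", "/interim": "/new"}
result = trace_redirects("/old", redirects)
print(result)  # 2 hops: flatten to a single /old → /new redirect
```

Anything with more than one hop is a candidate for flattening; anything with `loop: True` needs an immediate fix.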

Findings: ___

URL Consistency

  • URLs use HTTPS throughout (HTTP redirects to HTTPS)
  • www and non-www versions resolve to a single canonical version
  • Trailing slashes are consistent (either always present or never present)
  • No duplicate content from URL parameter variations (e.g., /page?sort=date and /page show the same content)
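One way to enforce these rules consistently is a single canonicalization function applied wherever URLs are generated. A sketch, assuming an https + non-www + no-trailing-slash policy and a hypothetical set of parameters to strip:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical query parameters that create duplicate-content variants
STRIP_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sort", "ref"}

def canonicalize(url: str) -> str:
    """Normalize to https, non-www, no trailing slash, no tracking params."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in STRIP_PARAMS])
    return urlunsplit(("https", host, path, query, ""))

print(canonicalize("http://www.yourdomain.com/page/?sort=date"))
# → https://yourdomain.com/page
```

Whatever policy you choose (www vs. non-www, trailing slash vs. none), the key is to pick one and apply it everywhere: server redirects, canonical tags, sitemap, and internal links.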

Part 3: Site Speed and Core Web Vitals

Core Web Vitals are Google ranking factors. Measure them in Google Search Console → Core Web Vitals, or PageSpeed Insights.

Core Web Vitals Audit

| Metric | Good | Needs Improvement | Poor | Your Score | Status |
|---|---|---|---|---|---|
| LCP (Largest Contentful Paint) | < 2.5s | 2.5-4.0s | > 4.0s | ___ | ___ |
| INP (Interaction to Next Paint) | < 200ms | 200-500ms | > 500ms | ___ | ___ |
| CLS (Cumulative Layout Shift) | < 0.1 | 0.1-0.25 | > 0.25 | ___ | ___ |
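These thresholds are easy to encode, which helps when scoring PageSpeed Insights results for many pages at once. A minimal sketch (treating values at the boundary as Good, per Google's "at or below" definitions):

```python
# Core Web Vitals: (good_max, poor_min) boundaries per metric
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Classify a metric value as Good / Needs Improvement / Poor."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    return "Needs Improvement" if value <= poor else "Poor"

print(rate("LCP", 2.1), rate("INP", 350), rate("CLS", 0.3))
# → Good Needs Improvement Poor
```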

For mobile specifically:

  • Mobile LCP under 2.5 seconds
  • No significant CLS (images and embeds have explicit dimensions)

Page Speed Optimization Checklist

Images:

  • All images compressed (use WebP format where possible)
  • Images served at display size (no 4000px images displayed at 400px)
  • Lazy loading enabled for below-the-fold images
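For example, a below-the-fold image with explicit dimensions (which also prevents layout shift) and native lazy loading; the file path and alt text are placeholders:

```html
<img src="/img/feature.webp" width="800" height="450"
     alt="Feature screenshot" loading="lazy">
```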

Code:

  • CSS minified
  • JavaScript minified and deferred (non-critical JS loads after page render)
  • No render-blocking scripts in <head>
  • Remove unused CSS and JavaScript
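Non-critical scripts can be kept out of the rendering path with the `defer` attribute, which downloads the file in parallel but runs it only after HTML parsing finishes (the script path is a placeholder):

```html
<script src="/js/analytics.js" defer></script>
```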

Server:

  • CDN in place (Cloudflare, Fastly, etc.)
  • Browser caching enabled
  • GZIP compression enabled

Hosting:

  • Server response time (TTFB) under 600ms
  • Hosting plan adequate for current traffic levels

Part 4: On-Page Technical Elements

Title Tags and Meta Descriptions (Site-Wide)

Run Screaming Frog to audit all pages at once:

  • No missing title tags
  • No duplicate title tags
  • No title tags over 60 characters
  • No missing meta descriptions
  • No duplicate meta descriptions
  • No meta descriptions over 160 characters
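If you export the crawl results, these checks reduce to a few lines of scripting. A sketch over hypothetical `(url, title, meta_description)` rows; the pages and titles are placeholders:

```python
from collections import Counter

# Hypothetical crawl export: (url, title, meta_description)
pages = [
    ("/", "Acme - AI Widgets", "Build widgets faster with Acme."),
    ("/pricing", "Acme - AI Widgets", ""),      # duplicate title, missing description
    ("/blog/long-post", "A" * 70, "B" * 170),   # title and description both too long
]

title_counts = Counter(title for _, title, _ in pages)
issues = []
for url, title, desc in pages:
    if not title:
        issues.append((url, "missing title"))
    elif len(title) > 60:
        issues.append((url, "title over 60 chars"))
    if title and title_counts[title] > 1:
        issues.append((url, "duplicate title"))
    if not desc:
        issues.append((url, "missing meta description"))
    elif len(desc) > 160:
        issues.append((url, "description over 160 chars"))

for url, problem in issues:
    print(f"{url}: {problem}")
```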

Summary:

| Issue | Page Count | Priority |
|---|---|---|
| Missing title tags | ___ | Critical |
| Duplicate title tags | ___ | High |
| Missing meta descriptions | ___ | Medium |
| Duplicate meta descriptions | ___ | Medium |

H1 Audit

  • Every key page has exactly one H1
  • No pages with multiple H1 tags
  • No pages with missing H1
  • H1 is unique per page

Canonical Tags

  • Canonical tags are set on all pages
  • Canonical tags point to the correct URL (self-referential canonical is fine and recommended)
  • No conflicting canonical tags (canonical points to a URL that itself has a different canonical)
  • Paginated pages use correct canonical/pagination markup
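A correct self-referential canonical, placed in the page's <head>, looks like this (the URL is a placeholder):

```html
<link rel="canonical" href="https://yourdomain.com/blog/post">
```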

Part 5: Mobile and UX

Mobile Optimization

  • Site renders correctly on mobile (Google retired the standalone Mobile-Friendly Test in 2023; run a Lighthouse audit in Chrome DevTools instead)
  • Viewport meta tag set: <meta name="viewport" content="width=device-width, initial-scale=1">
  • No horizontal scrolling on mobile
  • Tap targets (buttons, links) are minimum 48x48px
  • Text is readable without zooming (minimum 16px font for body text)
  • No interstitials that block content on mobile (Google penalizes intrusive popups)

Structured Data / Schema Markup

  • Article schema on all blog posts
  • Organization schema on homepage
  • BreadcrumbList schema on category/post pages
  • FAQPage schema on FAQ sections
  • HowTo schema on how-to content (Google stopped showing HowTo rich results in 2023, though the markup remains valid)
  • Product/SoftwareApplication schema on product pages
  • All schema validates without errors at validator.schema.org
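A minimal JSON-LD Article block, with placeholder values, placed in the post's <head>; paste yours into validator.schema.org to confirm it parses:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Audit Template",
  "datePublished": "2024-01-15",
  "author": { "@type": "Organization", "name": "Your Company" }
}
</script>
```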

Part 6: Security and Trust

  • SSL certificate valid and not expiring within 30 days (check ssllabs.com)
  • All pages load via HTTPS
  • No mixed content warnings (HTTP resources loaded on HTTPS pages)
  • No malware detected (check Google Search Console > Security Issues)
  • No manual actions from Google (check Search Console > Manual Actions)
  • Privacy policy page exists and linked in footer
  • Site not on any spam blacklists (check mxtoolbox.com/blacklists)

Part 7: Structured Data and Rich Results

Check if you're eligible for and capturing available SERP features:

  • Breadcrumbs appearing in search results
  • FAQ rich results appearing for FAQ-schema pages (Google now shows these only for a limited set of authoritative sites)
  • Sitelinks appearing for branded searches
  • Article rich results configured for blog posts

Test in: Google's Rich Results Test (search.google.com/test/rich-results)


Audit Summary and Priority Matrix

After completing the audit, summarize your findings:

| Issue | Severity | Pages Affected | Estimated Effort | Priority |
|---|---|---|---|---|
| ___ | Critical / High / Med / Low | ___ | ___ hours | Fix Now / Next Sprint / Quarterly |

Critical (fix immediately):

  • Site blocked in robots.txt
  • No SSL
  • Manual action from Google
  • Core pages not indexed

High (fix within 1-2 weeks):

  • LCP/CWV failures on key pages
  • Redirect chains
  • Duplicate title tags on important pages
  • Missing H1 tags on indexed pages

Medium (schedule within quarter):

  • Missing meta descriptions
  • Image compression
  • Missing schema markup
  • Canonical tag inconsistencies

Low (batch quarterly):

  • Minor formatting inconsistencies
  • Non-critical missing alt text
  • Low-priority page speed improvements

How Averi Supports Technical Content Health

Averi's publishing workflow includes a pre-publish technical check: canonical URLs, meta data, and proper formatting are verified before content goes live in your connected CMS (WordPress, Webflow, Framer). This prevents the most common technical SEO errors from entering your site in the first place.

Publish technically sound content with Averi →

Frequently Asked Questions

How often should I run a full technical SEO audit?

Run a comprehensive audit quarterly. Set up Google Search Console alerts for critical issues (crawl errors, security issues, Core Web Vitals failures) so you catch major problems immediately rather than waiting for the next scheduled audit.

Do I need a developer to fix technical SEO issues?

Not always. Issues like missing meta descriptions, title tag fixes, and image compression can often be done directly in your CMS. Issues like robots.txt configuration, redirect rules, schema markup, and page speed optimization often require developer help or CMS configuration changes.

What's the most common technical SEO problem for startup sites?

Accidentally noindexing pages or blocking them in robots.txt is surprisingly common, especially after site migrations. The second most common: no HTTPS (or mixed content after migrating to HTTPS). The third: poor Core Web Vitals due to uncompressed images and unoptimized JavaScript.

Does site speed really affect rankings?

Yes, but primarily as a tiebreaker. Google has confirmed Core Web Vitals as ranking signals since 2021. A site with excellent content and slow speed will usually still outrank a site with mediocre content and fast speed. But when two pages are comparable in quality, speed can make the difference.

How do I know if a recent traffic drop is technical or algorithmic?

Check Google Search Console's Coverage report and Core Web Vitals report for changes around the date of the drop. Also check for Google algorithm update dates (search "Google algorithm update [month year]"). If your coverage metrics changed, it's likely technical. If metrics are stable but traffic dropped for many pages simultaneously, it's likely algorithmic.
