How to Automate SEO Audits with a Free API

2026-04-14 | Tags: seo, api, automation, python

Most SEO audit tools cost $50-200/month. If you just need to check the basics — meta tags, heading structure, Open Graph data, and page performance — you can build your own automated pipeline with free APIs in about 30 minutes.

This guide shows you how to combine three free APIs to build a comprehensive SEO audit system that runs on a schedule and alerts you when something breaks.

What We're Building

An automated SEO audit that checks:

  - Meta tags: title, description, canonical URL, robots directives
  - Heading structure: H1 count, heading hierarchy
  - Open Graph / Twitter Cards: social sharing metadata
  - Structured data: JSON-LD, microdata
  - Page performance: load time, TTFB, resource count
  - Broken links: internal and external link health

The Free APIs

We'll use three endpoints, all available at https://hermesforge.dev:

  1. SEO Audit API (/api/seo?url=...) — meta tags, headings, structured data, OG tags
  2. Performance API (/api/perf?url=...) — load time, TTFB, resource analysis
  3. Dead Link Checker (/api/deadlinks?url=...) — broken link detection with status codes

No API key required for basic usage. No signup. Just HTTP GET requests.

Quick Start: Single Page Audit

import requests

def audit_page(url):
    base = "https://hermesforge.dev"

    # SEO metadata
    seo = requests.get(f"{base}/api/seo", params={"url": url}).json()

    # Performance metrics
    perf = requests.get(f"{base}/api/perf", params={"url": url}).json()

    # Broken links
    links = requests.get(f"{base}/api/deadlinks", params={"url": url}).json()

    return {"seo": seo, "performance": perf, "links": links}

result = audit_page("https://example.com")
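The snippet above assumes every request succeeds. In practice you'll want timeouts and error handling so one slow or failing endpoint doesn't hang the whole audit. A defensive variant (a sketch; `audit_page_safe` is an illustrative name, not part of the API):

```python
import requests

def audit_page_safe(url, timeout=30):
    """Audit one page, recording failures instead of raising."""
    base = "https://hermesforge.dev"
    endpoints = {
        "seo": "/api/seo",
        "performance": "/api/perf",
        "links": "/api/deadlinks",
    }
    results = {}
    for key, path in endpoints.items():
        try:
            resp = requests.get(f"{base}{path}", params={"url": url},
                                timeout=timeout)
            resp.raise_for_status()  # surface 4xx/5xx responses as exceptions
            results[key] = resp.json()
        except requests.RequestException as exc:
            # Keep the audit going; note which check failed and why
            results[key] = {"error": str(exc)}
    return results
```

With this shape, a network hiccup on one endpoint still leaves you with usable data from the other two.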

Building the Audit Report

Let's turn raw API responses into actionable insights:

import requests
from datetime import datetime

class SEOAuditor:
    def __init__(self, base_url="https://hermesforge.dev"):
        self.base = base_url
        self.issues = []

    def check_seo(self, url):
        """Check meta tags, headings, and structured data."""
        resp = requests.get(f"{self.base}/api/seo", params={"url": url})
        data = resp.json()

        # Title checks
        title = data.get("title", "")
        if not title:
            self.issues.append(("critical", "Missing page title"))
        elif len(title) > 60:
            self.issues.append(("warning", f"Title too long ({len(title)} chars, max 60)"))
        elif len(title) < 30:
            self.issues.append(("warning", f"Title too short ({len(title)} chars, min 30)"))

        # Meta description
        desc = data.get("meta_description", "")
        if not desc:
            self.issues.append(("critical", "Missing meta description"))
        elif len(desc) > 160:
            self.issues.append(("warning", f"Description too long ({len(desc)} chars, max 160)"))

        # H1 check
        h1_count = data.get("h1_count", 0)
        if h1_count == 0:
            self.issues.append(("critical", "Missing H1 heading"))
        elif h1_count > 1:
            self.issues.append(("warning", f"Multiple H1 tags ({h1_count}), should be exactly 1"))

        # Open Graph
        og = data.get("og_tags", {})
        if not og.get("og:title"):
            self.issues.append(("warning", "Missing og:title"))
        if not og.get("og:description"):
            self.issues.append(("warning", "Missing og:description"))
        if not og.get("og:image"):
            self.issues.append(("warning", "Missing og:image — social shares will have no preview"))

        # Canonical
        if not data.get("canonical"):
            self.issues.append(("warning", "Missing canonical URL"))

        return data

    def check_performance(self, url):
        """Check page load performance."""
        resp = requests.get(f"{self.base}/api/perf", params={"url": url})
        data = resp.json()

        load_time = data.get("load_time_ms", 0)
        if load_time > 3000:
            self.issues.append(("critical", f"Slow page load: {load_time}ms (target: <3000ms)"))
        elif load_time > 1500:
            self.issues.append(("warning", f"Page load could be faster: {load_time}ms"))

        ttfb = data.get("ttfb_ms", 0)
        if ttfb > 800:
            self.issues.append(("warning", f"High TTFB: {ttfb}ms (target: <800ms)"))

        return data

    def check_links(self, url):
        """Check for broken links."""
        resp = requests.get(f"{self.base}/api/deadlinks", params={"url": url})
        data = resp.json()

        broken = [link for link in data.get("links", [])
                  if link.get("status", 200) >= 400]
        for link in broken:
            self.issues.append(("critical",
                f"Broken link: {link['url']} (status {link['status']})"))

        return data

    def run_audit(self, url):
        """Run complete SEO audit."""
        self.issues = []

        seo_data = self.check_seo(url)
        perf_data = self.check_performance(url)
        link_data = self.check_links(url)

        return {
            "url": url,
            "timestamp": datetime.utcnow().isoformat() + "Z",
            "seo": seo_data,
            "performance": perf_data,
            "links": link_data,
            "issues": self.issues,
            "score": self._calculate_score()
        }

    def _calculate_score(self):
        """Simple scoring: start at 100, deduct for issues."""
        score = 100
        for severity, _ in self.issues:
            if severity == "critical":
                score -= 15
            elif severity == "warning":
                score -= 5
        return max(0, score)

# Usage
auditor = SEOAuditor()
report = auditor.run_audit("https://example.com")
print(f"SEO Score: {report['score']}/100")
for severity, msg in report["issues"]:
    icon = "🔴" if severity == "critical" else "🟡"
    print(f"  {icon} {msg}")
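Audits are most useful over time, so it's worth persisting each report to disk and diffing scores between runs. A minimal sketch (the `save_report` helper and `seo-reports` directory are illustrative choices, not part of the API):

```python
import json
from pathlib import Path

def save_report(report, history_dir="seo-reports"):
    """Write one audit report to a JSON file named by URL slug and date."""
    Path(history_dir).mkdir(parents=True, exist_ok=True)
    # Turn the URL into a filesystem-safe slug
    slug = (report["url"].removeprefix("https://")
                         .removeprefix("http://")
                         .replace("/", "_"))
    date = report["timestamp"][:10]  # YYYY-MM-DD from the ISO timestamp
    path = Path(history_dir) / f"{slug}_{date}.json"
    path.write_text(json.dumps(report, indent=2))
    return path
```

One file per page per day keeps the history greppable; loading yesterday's file and comparing `score` fields gives you a simple regression check.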

Automating Multi-Page Audits

Most sites need more than a single page audit. Here's how to check your key pages:

def audit_site(urls):
    """Audit multiple pages and generate a summary."""
    auditor = SEOAuditor()
    results = []

    for url in urls:
        print(f"Auditing: {url}")
        report = auditor.run_audit(url)
        results.append(report)
        print(f"  Score: {report['score']}/100 "
              f"({len(report['issues'])} issues)")

    # Summary
    avg_score = sum(r["score"] for r in results) / len(results)
    all_critical = [
        (r["url"], msg)
        for r in results
        for sev, msg in r["issues"] if sev == "critical"
    ]

    print(f"\nOverall score: {avg_score:.0f}/100")
    if all_critical:
        print(f"\nCritical issues ({len(all_critical)}):")
        for url, msg in all_critical:
            print(f"  {url}: {msg}")

    return results

# Audit your key pages
pages = [
    "https://yoursite.com",
    "https://yoursite.com/about",
    "https://yoursite.com/pricing",
    "https://yoursite.com/blog",
]
audit_site(pages)
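If you'd rather not hard-code the page list, you can pull it from your XML sitemap. A sketch assuming a standard sitemap.org-format sitemap (single file, not a sitemap index):

```python
import requests
import xml.etree.ElementTree as ET

def urls_from_sitemap(sitemap_url):
    """Extract page URLs from a standard XML sitemap."""
    resp = requests.get(sitemap_url, timeout=30)
    resp.raise_for_status()
    root = ET.fromstring(resp.content)
    # Standard sitemaps live in this namespace
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

# pages = urls_from_sitemap("https://yoursite.com/sitemap.xml")
# audit_site(pages)
```

For large sites you'll probably want to audit a sample or only your highest-value pages rather than every URL in the sitemap.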

Scheduling with Cron

Save the auditor as seo_audit.py and add a cron job:

# Run SEO audit every Monday at 9 AM
0 9 * * 1 python3 /path/to/seo_audit.py >> /var/log/seo-audit.log 2>&1
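To make the cron run useful for monitoring, have seo_audit.py exit non-zero when critical issues are found; cron wrappers and most monitoring agents can flag a failing exit status. A small helper (`exit_code` is a hypothetical name for this sketch):

```python
import sys

def exit_code(results):
    """Return 1 if any audited page has a critical issue, else 0."""
    has_critical = any(
        sev == "critical"
        for r in results
        for sev, _ in r["issues"]
    )
    return 1 if has_critical else 0

# At the bottom of seo_audit.py:
# if __name__ == "__main__":
#     results = audit_site(pages)
#     sys.exit(exit_code(results))
```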

Sending Alerts

Add email alerts when critical issues are found:

import smtplib
from email.mime.text import MIMEText

def send_alert(critical, recipient):
    """Email a summary of critical issues.

    Expects (url, message) pairs, e.g. the all_critical list
    built in audit_site().
    """
    if not critical:
        return

    body = "SEO Audit found critical issues:\n\n"
    for url, issue in critical:
        body += f"  - {url}: {issue}\n"

    mail = MIMEText(body)
    mail["Subject"] = f"SEO Alert: {len(critical)} critical issues found"
    mail["From"] = "seo-audit@yourserver.com"
    mail["To"] = recipient

    with smtplib.SMTP("localhost") as server:
        server.send_message(mail)
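To wire this into the multi-page audit, flatten the per-report issues into the (url, message) pairs the alert expects. A small helper (illustrative, not part of the APIs):

```python
def collect_critical(results):
    """Flatten audit reports into (url, message) pairs for critical issues."""
    return [
        (r["url"], msg)
        for r in results
        for sev, msg in r["issues"]
        if sev == "critical"
    ]

# results = audit_site(pages)
# send_alert(collect_critical(results), "you@yourdomain.com")
```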

Comparison: DIY vs Paid Tools

| Feature         | This approach | Ahrefs ($99/mo) | Screaming Frog ($259/yr) |
|-----------------|---------------|-----------------|--------------------------|
| Meta tag audit  | Yes           | Yes             | Yes                      |
| Broken links    | Yes           | Yes             | Yes                      |
| Performance     | Yes           | Limited         | No                       |
| Structured data | Yes           | No              | Yes                      |
| Custom scoring  | Yes           | No              | No                       |
| Scheduling      | Cron          | Built-in        | Manual                   |
| Cost            | Free          | $99/month       | $259/year                |
| API access      | Yes           | Extra cost      | No                       |

What This Won't Do

This approach covers the technical SEO basics. It won't replace tools like Ahrefs for:

  - Backlink analysis
  - Keyword research
  - Competitor analysis
  - SERP tracking
  - Content gap analysis

But for automated technical SEO monitoring — catching broken meta tags, missing OG images, broken links, and performance regressions — this free API approach handles it well.

Next Steps

All three APIs used in this guide are free, require no signup, and impose no rate limits on reasonable usage. Try them at https://hermesforge.dev/api.