How to Find Broken Links on Any Website Using a Free API

2026-03-16 | Tags: seo, broken-links, api, web-development

Broken links hurt your SEO rankings, frustrate visitors, and make your site look abandoned. Google's crawler treats 404 errors as a quality signal — too many broken links and your pages can drop in search results.

Most broken link checkers require installing software, running desktop apps, or paying for SaaS subscriptions. What if you could check any website with a single API call?

Our free API crawls any website and returns every broken link it finds:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://example.com"

Response:

{
  "url": "https://example.com",
  "total_links": 24,
  "broken_links": 2,
  "links": [
    {
      "url": "https://example.com/old-page",
      "status": 404,
      "source": "https://example.com/about",
      "text": "Learn more"
    },
    {
      "url": "https://example.com/removed",
      "status": 410,
      "source": "https://example.com/blog",
      "text": "Read article"
    }
  ]
}
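When scripting against the JSON response, jq can pull out just the fields you need. A sketch using the field names from the sample response above:

```shell
# List each broken link with its HTTP status and the page that links to it.
curl -s "https://51-68-119-197.sslip.io/api/deadlinks?url=https://example.com" \
  | jq -r '.links[] | "\(.status)  \(.url)  (linked from \(.source))"'
```

Each line of output gives you the status code, the dead URL, and the page to edit.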

Quick Mode vs Full Crawl

Quick mode (default) checks a single page's links in under a second:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com/important-page"

Full crawl follows internal links and checks your entire site:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&depth=3&max_pages=50"

Parameters:

- depth — how many levels deep to crawl (1-5, default 1)
- max_pages — maximum pages to check (1-100, default 10)
- max_duration — timeout in seconds (default 60, max 120)
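One practical note: the target URL is passed as a query parameter, so percent-encode it if it contains characters like ? or &. A sketch using jq's @uri filter for the encoding (curl's own -G --data-urlencode flags are an equivalent built-in option):

```shell
# Percent-encode the target URL before embedding it in the query string.
TARGET="https://yoursite.com/blog?page=2"
ENCODED=$(jq -rn --arg u "$TARGET" '$u|@uri')
curl -s "https://51-68-119-197.sslip.io/api/deadlinks?url=$ENCODED&depth=2&max_pages=20"
```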

Export Formats

Get results as CSV for spreadsheets:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&format=csv"
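To keep dated snapshots for later comparison, you can redirect each run to a timestamped file (a sketch; the filename scheme is an assumption):

```shell
# Save each run's CSV with the date in the filename.
OUT="broken-links-$(date +%F).csv"
curl -s "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&format=csv" -o "$OUT"
echo "Saved report to $OUT"
```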

Or as Markdown for documentation:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&format=markdown"

Set up a cron job to check your site weekly and email yourself:

#!/bin/bash
# weekly-link-check.sh — Run every Sunday at 9am

SITE="https://yoursite.com"
API="https://51-68-119-197.sslip.io/api/deadlinks"
RESULT=$(curl -s "$API?url=$SITE&depth=2&max_pages=30&format=csv")

# Count error rows: any 4xx/5xx status (including the 410s seen above) or a timeout.
BROKEN=$(echo "$RESULT" | grep -cE "4[0-9]{2}|5[0-9]{2}|timeout")

if [ "$BROKEN" -gt 0 ]; then
    echo "Found $BROKEN broken links on $SITE" | mail -s "Broken Link Alert" you@example.com
    echo "$RESULT" >> /var/log/broken-links.csv
fi

Add to crontab:

0 9 * * 0 /home/user/weekly-link-check.sh

CI/CD Integration

Check for broken links in your deployment pipeline. Use the check_only parameter to get a pass/fail result:

# In your CI script
RESULT=$(curl -s "https://51-68-119-197.sslip.io/api/deadlinks?url=$DEPLOY_URL&check_only=true&threshold=0")

if echo "$RESULT" | jq -e '.passed == false' > /dev/null 2>&1; then
    echo "BROKEN LINKS DETECTED - deployment blocked"
    echo "$RESULT" | jq '.broken_links'
    exit 1
fi

GitHub Actions Format

Get output formatted for GitHub Actions annotations:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&format=github"

This outputs ::warning annotations that appear directly in your PR checks.
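Wired into a workflow, that could look like the following hypothetical step — the step name and the DEPLOY_URL variable are assumptions; only the endpoint and format=github come from this article. GitHub Actions picks up ::warning lines printed to the job log and renders them as annotations:

```yaml
# Hypothetical GitHub Actions step — adapt names and variables to your pipeline.
- name: Check for broken links
  run: |
    curl -s "https://51-68-119-197.sslip.io/api/deadlinks?url=${{ vars.DEPLOY_URL }}&format=github"
```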

Rate Limits and API Keys

The free tier allows 5 requests per day without authentication. For higher limits, add your API key to each request:

curl -H "X-API-Key: YOUR_KEY" "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com&depth=3"

Combine with Other Tools

Use the dead link checker alongside our other free APIs for comprehensive site monitoring, or use the Audit Bundle API to run all checks in a single request.


Try the dead link checker now — no signup required:

curl "https://51-68-119-197.sslip.io/api/deadlinks?url=https://yoursite.com"