The audit command crawls a website and runs SEO rules to generate a report.

Usage

squirrel audit <url> [options]

Arguments

Argument   Description
url        The URL to audit (required)

Options

Option        Alias   Description                           Default
--maxPages    -m      Maximum pages to crawl                50
--format      -f      Output format: console, json, html    console
--output      -o      Output file path                      auto
--refresh     -r      Ignore cache, fetch all pages fresh   false
--verbose     -v      Show progress during crawl            false
--debug               Enable debug logging                  false
--all         -a      Run all rules including LLM-based     false

Examples

Basic Audit

squirrel audit https://example.com

Crawl More Pages

squirrel audit https://example.com -m 200

Export to JSON

squirrel audit https://example.com -f json -o report.json

Generate HTML Report

squirrel audit https://example.com -f html -o report.html

Fresh Crawl (Ignore Cache)

squirrel audit https://example.com --refresh

Verbose Output

squirrel audit https://example.com -v

Run All Rules

squirrel audit https://example.com --all

Output Formats

Console (default)

Human-readable output with colored issue severity:
SquirrelScan v2.0
========================================
Auditing: https://example.com
Max pages: 50

Crawled 12 pages

ISSUES

[high] Missing meta description
  → /about
  → /contact

[medium] Image missing alt text
  → /images/hero.png on /

[low] Non-HTTPS link
  → http://oldsite.com on /links

JSON

Machine-readable JSON for CI/CD pipelines and AI processing:
squirrel audit https://example.com -f json -o report.json
{
  "url": "https://example.com",
  "crawledAt": "2025-01-08T00:00:00Z",
  "pages": [...],
  "issues": [...],
  "stats": {
    "totalPages": 12,
    "issueCount": { "high": 2, "medium": 3, "low": 5 }
  }
}
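
In a CI pipeline, you might parse the report and fail the build when high-severity issues appear. The sketch below inlines a report for illustration; the field names follow the JSON example above, but the gating logic is an assumption, not part of the tool:

```python
import json

# Illustrative CI gate over an audit report. Field names follow the
# JSON example above; the report is inlined here so the sketch runs
# without invoking the CLI.
REPORT_JSON = """
{
  "url": "https://example.com",
  "stats": {
    "totalPages": 12,
    "issueCount": { "high": 2, "medium": 3, "low": 5 }
  }
}
"""

def high_issue_count(report: dict) -> int:
    # Treat missing keys as zero rather than crashing the CI step.
    return report.get("stats", {}).get("issueCount", {}).get("high", 0)

report = json.loads(REPORT_JSON)
print(f"high-severity issues: {high_issue_count(report)}")
# In CI, you would exit non-zero here when the count is above zero.
```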

HTML

Visual report you can open in a browser:
squirrel audit https://example.com -f html -o report.html
open report.html

Crawl Behavior

The audit command manages crawl sessions intelligently:
Scenario               Behavior
First run              Creates new crawl
Re-run (completed)     New crawl, uses cache for 304s
Re-run (interrupted)   Resumes from where it left off
Config changed         New crawl with fresh scope

Changing scope-affecting config (include, exclude, allow_query_params, drop_query_prefixes) triggers a fresh crawl.
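
That scope check amounts to comparing just those four keys between runs. The sketch below is illustrative only, not SquirrelScan's actual implementation:

```python
# Illustrative sketch: a previous crawl is reusable only if none of the
# scope-affecting config keys changed between runs.
SCOPE_KEYS = ("include", "exclude", "allow_query_params", "drop_query_prefixes")

def needs_fresh_crawl(old_cfg: dict, new_cfg: dict) -> bool:
    return any(old_cfg.get(k) != new_cfg.get(k) for k in SCOPE_KEYS)

old = {"include": ["/blog/*"], "exclude": ["/admin/*"]}
new = {"include": ["/blog/*", "/docs/*"], "exclude": ["/admin/*"]}
print(needs_fresh_crawl(old, new))  # a changed include list forces a fresh crawl
```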

Caching

SquirrelScan caches page content locally. On subsequent audits:
  • 304 Not Modified: Uses cached content (fast)
  • 200 OK: Fetches fresh content (slower)
Use --refresh to bypass the cache entirely.
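
The cache decision described above reduces to the response status plus the --refresh flag. This is a sketch of the documented behavior, not the tool's actual code:

```python
# Sketch of the cache decision described above (not SquirrelScan's code).
def use_cached_copy(status_code: int, refresh: bool) -> bool:
    if refresh:                  # --refresh bypasses the cache entirely
        return False
    return status_code == 304    # 304 Not Modified -> reuse cached content

print(use_cached_copy(304, refresh=False))  # True: serve from cache
print(use_cached_copy(200, refresh=False))  # False: fetch fresh content
```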

Exit Codes

Code   Meaning
0      Success
1      Error (invalid URL, crawl failed, etc.)
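
A wrapper script can branch on that exit code directly. In the sketch below, the real invocation would be ["squirrel", "audit", "https://example.com"]; a `sys.executable -c` stand-in is used so the example is self-contained:

```python
import subprocess
import sys

# Branch on the audit's exit code (0 = success, 1 = error). A stand-in
# command replaces `squirrel audit` so the sketch runs without the CLI.
def run_audit(cmd: list[str]) -> int:
    return subprocess.run(cmd).returncode

ok = run_audit([sys.executable, "-c", "raise SystemExit(0)"])
failed = run_audit([sys.executable, "-c", "raise SystemExit(1)"])
print("first run:", "success" if ok == 0 else "error")
print("second run:", "success" if failed == 0 else "error")
```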

Configuration

The audit command respects settings from squirrel.toml:
[crawler]
max_pages = 100
delay_ms = 200
timeout_ms = 30000
include = ["/blog/*"]
exclude = ["/admin/*"]

[rules]
enable = ["*"]
disable = ["ai/*"]
See Configuration for all options.

Using with AI

Pipe JSON output to your LLM:
squirrel audit https://example.com -f json | claude "analyze this SEO report"
Or use with Claude Code’s MCP integration for interactive auditing.