* Fix SEO skill 34 bugs, Korean labels, and transition Ahrefs refs to our-seo-agent

  P0: Fix report_aggregator.py — wrong SKILL_REGISTRY[33] mapping, missing CATEGORY_WEIGHTS for 7 categories, and a `break` bug in health-score parsing that exited the loop even on parse failure.
  P1: Remove VIEW tab references from skill 20, expand skill 32 docs, replace Ahrefs MCP references across all 16 skills (19-28, 31-34) with our-seo-agent CLI data source references.
  P2: Fix Korean labels in executive_report.py and dashboard_generator.py, add tenacity to base requirements, sync skill 34 base_client.py with the canonical version from skill 12.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Add Claude Code slash commands for SEO skills 19-34 and fix stale paths

  Create 14 new slash command files for skills 19-28 and 31-34 so they appear as /seo-* commands in Claude Code. Also fix stale directory paths in 8 existing commands (skills 12-18, 29-30) that referenced pre-renumbering skill directories. Update .gitignore to track .claude/commands/ while keeping other .claude/ files ignored.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Add 8 slash commands, enhance reference-curator with depth/output options

  - Add slash commands: ourdigital-brand-guide, notion-writer, notebooklm-agent, notebooklm-automation, notebooklm-studio, notebooklm-research, reference-curator, multi-agent-guide
  - Add --depth (light/standard/deep/full) with Firecrawl parameter mapping
  - Add --output with ~/Documents/reference-library/ default and user confirmation
  - Increase --max-sources default from 10 to 100
  - Rename /reference-curator-pipeline to /reference-curator
  - Simplify the web-crawler-orchestrator label to web-crawler in docs

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
---
description: SEO Technical Audit command
---
# SEO Technical Audit

Technical SEO audit for robots.txt and sitemap validation.

## Triggers

- "check robots.txt", "validate sitemap", "technical SEO"

## Capabilities

1. **Robots.txt Analysis** - Parse and validate robots.txt rules
2. **Sitemap Validation** - Check XML sitemap structure and URLs
3. **Sitemap Crawling** - Crawl all URLs in sitemap for issues

## Scripts
```bash
# Check robots.txt
python custom-skills/12-seo-technical-audit/code/scripts/robots_checker.py \
  --url https://example.com

# Validate sitemap
python custom-skills/12-seo-technical-audit/code/scripts/sitemap_validator.py \
  --url https://example.com/sitemap.xml

# Crawl sitemap URLs
python custom-skills/12-seo-technical-audit/code/scripts/sitemap_crawler.py \
  --sitemap https://example.com/sitemap.xml --output report.json
```
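The internals of `robots_checker.py` are not shown here, but the kind of robots.txt rule analysis it performs can be sketched with the standard library's `urllib.robotparser` alone (the rules and URLs below are illustrative, not taken from the skill):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; a real checker would fetch this
# from https://example.com/robots.txt instead.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether representative paths are crawlable for a generic agent.
for path in ("/", "/admin/secret", "/blog/post-1"):
    allowed = parser.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")

# Sitemap declarations are also surfaced by the parser (Python 3.8+).
print("Sitemaps:", parser.site_maps())
```

Rule precedence follows `urllib.robotparser`'s first-match semantics, which is why `/admin/secret` is reported as disallowed while `/` remains allowed.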

## Output

- Robots.txt rule analysis
- Sitemap structure validation
- URL accessibility report
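The sitemap structure validation listed above can likewise be sketched with the standard library: parse the XML, confirm the `urlset` root, and collect `<loc>` entries for the accessibility report (the sitemap content here is a made-up example, not output from `sitemap_validator.py`):

```python
import xml.etree.ElementTree as ET

# Illustrative sitemap; a real validator would download it via --url.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2024-01-15</lastmod></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)

# Basic structural check: the root element must be a namespaced <urlset>.
assert root.tag.endswith("urlset"), "not a sitemap urlset"

# Collect every <loc> URL; these would then be crawled for accessibility.
urls = [u.findtext("sm:loc", namespaces=NS) for u in root.findall("sm:url", NS)]
print(f"{len(urls)} URLs found:", urls)
```

A fuller validator would also check `<lastmod>` date formats and the 50,000-URL / 50 MB limits from the sitemaps.org protocol before crawling each URL.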