refactor(skills): Restructure skills to dual-platform architecture
Major refactoring of ourdigital-custom-skills with a new numbering system.

## Structure Changes
- Each skill now has code/ (Claude Code) and desktop/ (Claude Desktop) versions
- New progressive numbering: 01-09 General, 10-19 SEO, 20-29 GTM, 30-39 OurDigital, 40-49 Jamie

## Skill Reorganization
- 01-notion-organizer (from 02)
- 10-18: SEO tools split into focused skills (technical, on-page, local, schema, vitals, gsc, gateway)
- 20-21: GTM audit and manager
- 30-32: OurDigital designer, research, presentation
- 40-41: Jamie brand editor and audit

## New Files
- .claude/commands/: Slash command definitions for all skills
- CLAUDE.md: Updated with new skill structure documentation
- REFACTORING_PLAN.md: Migration documentation
- COMPATIBILITY_REPORT.md, SKILLS_COMPARISON.md: Analysis docs

## Removed
- Old skill directories (02-05, 10-14, 20-21 old numbering)
- Consolidated into new structure with _archive/ for reference

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
---
name: seo-technical-audit
version: 1.0.0
description: Technical SEO auditor for crawlability fundamentals. Triggers: robots.txt, sitemap validation, crawlability, indexing check, technical SEO.
allowed-tools: mcp__firecrawl__*, mcp__perplexity__*, mcp__notion__*
---

# SEO Technical Audit

## Purpose

Analyze crawlability fundamentals: robots.txt rules, XML sitemap structure, and URL accessibility. Identify issues blocking search engine crawlers.

## Core Capabilities

1. **Robots.txt Analysis** - Parse rules, check blocked resources
2. **Sitemap Validation** - Verify XML structure, URL limits, dates
3. **URL Accessibility** - Check HTTP status, redirects, broken links

## MCP Tool Usage

### Firecrawl for Page Data
```
mcp__firecrawl__scrape: Fetch robots.txt and sitemap content
mcp__firecrawl__crawl: Check accessibility of multiple URLs
```

### Perplexity for Best Practices
```
mcp__perplexity__search: Research current SEO recommendations
```

## Workflow

### 1. Robots.txt Check
1. Fetch `[domain]/robots.txt` using Firecrawl
2. Parse User-agent rules and Disallow patterns
3. Identify blocked resources (CSS, JS, images)
4. Check for Sitemap declarations
5. Report critical issues

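The parsing in steps 2-4 can be sketched with Python's standard-library robots.txt parser. The rules and resource paths below are hypothetical examples; in the skill itself the file content comes from `mcp__firecrawl__scrape`.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in the skill this is fetched from
# [domain]/robots.txt via mcp__firecrawl__scrape (step 1).
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /assets/js/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())
parser.modified()  # mark as fetched so can_fetch() evaluates the rules

# Step 3: flag resources a crawler is not allowed to fetch.
resources = ["/assets/js/app.js", "/styles/main.css", "/index.html"]
blocked = [p for p in resources if not parser.can_fetch("Googlebot", p)]

# Step 4: Sitemap declarations collected during parsing (Python 3.8+).
sitemaps = parser.site_maps() or []

print("Blocked resources:", blocked)
print("Sitemaps declared:", sitemaps)
```

Note that `RobotFileParser.can_fetch()` returns `False` for everything until the file is marked as read, hence the `modified()` call after `parse()`.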
### 2. Sitemap Validation
1. Locate sitemap (from robots.txt or `/sitemap.xml`)
2. Validate XML syntax
3. Check URL count (max 50,000)
4. Verify lastmod date formats
5. For sitemap index: parse child sitemaps

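Steps 2-5 can be sketched with the standard-library XML parser. The two-URL sitemap below is a hypothetical stand-in for the fetched file.

```python
import xml.etree.ElementTree as ET
from datetime import date

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Hypothetical sitemap; in the skill this is fetched from the URL
# declared in robots.txt or from [domain]/sitemap.xml (step 1).
SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2024-05-01</lastmod></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>
"""

root = ET.fromstring(SITEMAP_XML)  # step 2: raises ParseError on invalid XML

if root.tag.endswith("sitemapindex"):  # step 5: index files list child sitemaps
    child_maps = [e.text for e in root.findall("sm:sitemap/sm:loc", NS)]

urls = root.findall("sm:url", NS)
assert len(urls) <= 50_000, "sitemap exceeds the 50,000-URL limit"  # step 3

bad_dates = []  # step 4: lastmod should be W3C datetime; date part is ISO 8601
for u in urls:
    lastmod = u.findtext("sm:lastmod", default=None, namespaces=NS)
    if lastmod is not None:
        try:
            date.fromisoformat(lastmod[:10])
        except ValueError:
            bad_dates.append(lastmod)

print(f"URLs: {len(urls)}, invalid lastmod values: {bad_dates}")
```
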
### 3. URL Accessibility Sampling
1. Extract URLs from sitemap
2. Sample 50-100 URLs for large sites
3. Check HTTP status codes
4. Identify redirects and broken links
5. Report 4xx/5xx errors

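The sampling and bucketing logic is a few lines of Python. The status codes here are hypothetical; in practice each sampled URL is fetched (e.g. via `mcp__firecrawl__crawl`) to obtain its real code.

```python
import random
from collections import Counter

def sample_urls(urls, k=100, seed=0):
    """Step 2: deterministically sample up to k URLs for large sites."""
    urls = list(urls)
    if len(urls) <= k:
        return urls
    return random.Random(seed).sample(urls, k)

def categorize(status_by_url):
    """Steps 3-5: bucket HTTP status codes for the report."""
    buckets = Counter()
    for url, code in status_by_url.items():
        if 200 <= code < 300:
            buckets["2xx"] += 1
        elif 300 <= code < 400:
            buckets["3xx"] += 1
        else:  # everything else is reported as an error
            buckets["4xx/5xx"] += 1
    return buckets

# Hypothetical fetch results standing in for real crawl output.
statuses = {"https://example.com/": 200,
            "https://example.com/old": 301,
            "https://example.com/gone": 404}
print(categorize(statuses))  # one URL per bucket
```

A fixed seed keeps the sample reproducible across audit runs of the same site.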
## Output Format

```markdown
## Technical SEO Audit: [domain]

### Robots.txt Analysis
- Status: [Valid/Invalid/Missing]
- Sitemap declared: [Yes/No]
- Critical blocks: [List]

### Sitemap Validation
- URLs found: [count]
- Syntax: [Valid/Errors]
- Issues: [List]

### URL Accessibility (sampled)
- Checked: [count] URLs
- Success (2xx): [count]
- Redirects (3xx): [count]
- Errors (4xx/5xx): [count]

### Recommendations
1. [Priority fixes]
```

## Common Issues

| Issue | Impact | Fix |
|-------|--------|-----|
| No sitemap in robots.txt | Medium | Add `Sitemap:` directive |
| Blocking CSS/JS | High | Allow Googlebot access |
| 404s in sitemap | High | Remove or fix URLs |
| Missing lastmod | Low | Add dates for freshness signals |

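For the first row, the fix is a single line in robots.txt (the URL is illustrative):

```
# robots.txt — the Sitemap directive may appear anywhere in the file
Sitemap: https://www.example.com/sitemap.xml
```
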
## Limitations

- Cannot access password-protected sitemaps
- Large sitemaps (10,000+ URLs) require sampling
- Does not check render-blocking issues (use Core Web Vitals skill)