# OurDigital SEO Audit - Usage Examples
## Quick Reference
| Task | Command |
|------|---------|
| Full audit | `Perform SEO audit for [URL]` |
| Schema check | `Validate schema on [URL]` |
| PageSpeed | `Check Core Web Vitals for [URL]` |
| Sitemap | `Validate sitemap at [URL]` |
| Robots.txt | `Analyze robots.txt for [domain]` |
| Export to Notion | `Export findings to Notion` |
---
## Full Site Audit
### Basic Full Audit
**User prompt:**
> Perform a comprehensive SEO audit for https://example.com
**Agent workflow:**
1. Use Firecrawl to crawl the site and gather page data
2. Run PageSpeed Insights for Core Web Vitals
3. Validate schema markup, sitemap, and robots.txt
4. Query Search Console for performance data (if access granted)
5. Create Notion database with findings
6. Generate prioritized recommendations
---
## Technical SEO Check
### Crawlability Analysis
**User prompt:**
> Check technical SEO for https://example.com including crawlability and indexing
**Agent workflow:**
1. Fetch robots.txt and analyze directives
2. Validate sitemap.xml structure
3. Check HTTP response codes for key pages
4. Verify canonical tags and redirects
5. Test mobile-friendliness
---
## Schema Markup
### Generate Organization Schema
**User prompt:**
> Generate Organization schema markup for my company: OurDigital Corp, website https://ourdigital.com, founded 2020
**Agent response:**
```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "OurDigital Corp",
  "url": "https://ourdigital.com",
  "foundingDate": "2020",
  "sameAs": []
}
```
### Generate LocalBusiness Schema
**User prompt:**
> Generate LocalBusiness schema for a restaurant: Seoul Cafe, 123 Gangnam-daero, Seoul, open Mon-Fri 9am-6pm
**Agent response:**
```json
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Seoul Cafe",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Gangnam-daero",
    "addressLocality": "Seoul",
    "addressCountry": "KR"
  },
  "openingHoursSpecification": [{
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "09:00",
    "closes": "18:00"
  }]
}
```
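Schema blocks like the two above can also be built programmatically. A minimal sketch of a generator helper (the function name and parameters are illustrative, mirroring what a script like `scripts/schema_generator.py` might do):

```python
import json

def local_business_schema(name, street, locality, country, days, opens, closes,
                          business_type="Restaurant"):
    """Build a LocalBusiness-style JSON-LD block (illustrative helper)."""
    return {
        "@context": "https://schema.org",
        "@type": business_type,
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": locality,
            "addressCountry": country,
        },
        "openingHoursSpecification": [{
            "@type": "OpeningHoursSpecification",
            "dayOfWeek": days,
            "opens": opens,
            "closes": closes,
        }],
    }

schema = local_business_schema(
    "Seoul Cafe", "123 Gangnam-daero", "Seoul", "KR",
    ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"], "09:00", "18:00",
)
print(json.dumps(schema, indent=2))
```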
### Validate Existing Schema
**User prompt:**
> Validate existing schema markup on https://example.com
**Agent workflow:**
1. Scrape page with Firecrawl
2. Extract JSON-LD, Microdata, and RDFa
3. Validate against schema.org vocabulary
4. Check for required properties
5. Test Rich Results eligibility
6. Report issues and recommendations
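Steps 2 and 4 can be sketched with the standard library alone; the `REQUIRED` map below is an illustrative subset, not the full schema.org requirements:

```python
import json
import re

# Minimal required-property sets per type (illustrative subset)
REQUIRED = {"Organization": {"name", "url"}, "WebPage": {"name"}}

def extract_json_ld(html):
    """Pull JSON-LD blocks out of <script type="application/ld+json"> tags."""
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    blocks = []
    for raw in re.findall(pattern, html, re.DOTALL | re.IGNORECASE):
        try:
            blocks.append(json.loads(raw))
        except json.JSONDecodeError:
            pass  # a real audit would report this as a syntax issue
    return blocks

def missing_properties(block):
    required = REQUIRED.get(block.get("@type"), set())
    return sorted(required - block.keys())

html = '''<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "OurDigital Corp"}
</script>'''
issues = {b.get("@type"): missing_properties(b) for b in extract_json_ld(html)}
print(issues)  # {'Organization': ['url']}
```

A production validator would parse the HTML properly and also walk Microdata/RDFa, but the required-property diff is the core of step 4.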
---
## Sitemap Validation
### Check Sitemap
**User prompt:**
> Validate the sitemap at https://example.com/sitemap.xml
**Agent workflow:**
1. Fetch and parse XML sitemap
2. Validate XML syntax
3. Check URL count (max 50,000)
4. Verify lastmod dates
5. Test sample URLs for accessibility
6. Report issues found
**Sample output:**
```
Sitemap Validation Report
=========================
URL: https://example.com/sitemap.xml
Total URLs: 1,234
Valid URLs: 1,200
Issues Found:
- 34 URLs returning 404
- 12 URLs with invalid lastmod format
- Missing sitemap index (recommended for 1000+ URLs)
```
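The parsing and `lastmod` checks (steps 1-4) can be sketched with `xml.etree`; the sample sitemap and helper name are illustrative:

```python
import xml.etree.ElementTree as ET
from datetime import datetime

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(xml_text, max_urls=50_000):
    """Parse a <urlset> sitemap; report URL count and malformed lastmod values."""
    root = ET.fromstring(xml_text)
    urls = root.findall("sm:url", NS)
    issues = []
    if len(urls) > max_urls:
        issues.append(f"{len(urls)} URLs exceeds the {max_urls} limit")
    for url in urls:
        lastmod = url.findtext("sm:lastmod", default=None, namespaces=NS)
        if lastmod:
            try:
                datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
            except ValueError:
                loc = url.findtext("sm:loc", default="?", namespaces=NS)
                issues.append(f"invalid lastmod on {loc}: {lastmod}")
    return len(urls), issues

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2024-12-14</lastmod></url>
  <url><loc>https://example.com/blog/</loc><lastmod>14/12/2024</lastmod></url>
</urlset>"""
count, issues = check_sitemap(sample)
print(count, issues)
```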
---
## Robots.txt Analysis
### Analyze Robots.txt
**User prompt:**
> Analyze robots.txt for example.com
**Agent workflow:**
1. Fetch /robots.txt
2. Parse all directives
3. Check for blocking issues
4. Verify sitemap declaration
5. Test specific URLs
6. Compare user-agent rules
**Sample output:**
```
Robots.txt Analysis
==================
URL: https://example.com/robots.txt
User-agents defined: 3 (*, Googlebot, Bingbot)
Issues:
- WARNING: CSS/JS files blocked (/assets/)
- INFO: Crawl-delay set to 10 seconds (may slow indexing)
- OK: Sitemap declared
Rules Summary:
- Disallowed: /admin/, /private/, /tmp/
- Allowed: /public/, /blog/
```
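Steps 2 and 5 map directly onto the standard library's `urllib.robotparser`; a sketch with illustrative rules:

```python
from urllib.robotparser import RobotFileParser

# robots.txt content as fetched in step 1 (illustrative rules)
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /blog/
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Step 5: test specific URLs against the parsed rules
admin_ok = rp.can_fetch("Googlebot", "https://example.com/admin/dashboard")
blog_ok = rp.can_fetch("Googlebot", "https://example.com/blog/post-1")
print(f"/admin/dashboard crawlable: {admin_ok}")
print(f"/blog/post-1 crawlable: {blog_ok}")
print(f"sitemaps declared: {rp.site_maps()}")
```

`site_maps()` (Python 3.8+) covers step 4's sitemap-declaration check.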
---
## Core Web Vitals
### Performance Analysis
**User prompt:**
> Check Core Web Vitals for https://example.com
**Agent workflow:**
1. Run PageSpeed Insights API (mobile + desktop)
2. Extract Core Web Vitals metrics
3. Compare against thresholds
4. Identify optimization opportunities
5. Prioritize recommendations
**Sample output:**
```
Core Web Vitals Report
=====================
URL: https://example.com
Strategy: Mobile
Metrics:
- LCP: 3.2s (NEEDS IMPROVEMENT - target <2.5s)
- FID: 45ms (GOOD - target <100ms)
- CLS: 0.15 (NEEDS IMPROVEMENT - target <0.1)
- INP: 180ms (GOOD - target <200ms)
Top Opportunities:
1. Serve images in next-gen formats (-1.5s)
2. Eliminate render-blocking resources (-0.8s)
3. Reduce unused CSS (-0.3s)
```
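Step 3 (comparing metrics against thresholds) is a lookup against Google's published Core Web Vitals bands; a sketch using the same metric values as the sample report:

```python
# Google's published thresholds: (good cap, needs-improvement cap)
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
    "INP": (200, 500),    # milliseconds
}

def rate(metric, value):
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "GOOD"
    if value <= poor:
        return "NEEDS IMPROVEMENT"
    return "POOR"

for metric, value in {"LCP": 3.2, "CLS": 0.15, "INP": 180}.items():
    print(f"{metric}: {value} -> {rate(metric, value)}")
```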
---
## Local SEO Assessment
### Local SEO Audit
**User prompt:**
> Perform local SEO audit for Seoul Dental Clinic in Gangnam
**Agent workflow:**
1. Search for existing citations (Perplexity)
2. Check for LocalBusiness schema
3. Analyze NAP consistency
4. Review Google Business Profile (manual check)
5. Identify missing citations
6. Recommend improvements
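The NAP-consistency check in step 3 hinges on normalizing Name/Address/Phone before comparing citations; a sketch with illustrative heuristics:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize Name/Address/Phone so superficial formatting differences
    don't register as citation inconsistencies (illustrative heuristics)."""
    norm = lambda s: re.sub(r"\s+", " ", s).strip().lower()
    digits = re.sub(r"\D", "", phone)  # keep digits only
    return (norm(name), norm(address), digits)

a = normalize_nap("Seoul Dental Clinic", "123 Gangnam-daero, Seoul", "02-555-0123")
b = normalize_nap("Seoul  Dental Clinic", "123 gangnam-daero,  seoul", "02 555 0123")
print(a == b)  # True: the listings agree once normalized
```

Real citation audits also need abbreviation handling ("St." vs "Street") and country-code-aware phone comparison.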
---
## Keyword Research
### Trend Analysis
**User prompt:**
> Research keyword trends for "digital marketing" in Korea over the past year
**Agent workflow:**
1. Query Google Trends (pytrends)
2. Get related queries
3. Identify seasonal patterns
4. Compare with related terms
5. Generate insights
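Step 3 runs on the interest-over-time series pytrends returns; a library-free sketch of the seasonal-pattern check, with illustrative monthly values:

```python
from statistics import mean

# Illustrative monthly interest values (0-100), e.g. from pytrends'
# interest_over_time() for "digital marketing"
monthly = {
    "Jan": 72, "Feb": 68, "Mar": 75, "Apr": 70, "May": 66, "Jun": 61,
    "Jul": 58, "Aug": 60, "Sep": 74, "Oct": 78, "Nov": 81, "Dec": 63,
}

avg = mean(monthly.values())
peaks = [m for m, v in monthly.items() if v >= avg * 1.1]    # clearly above trend
troughs = [m for m, v in monthly.items() if v <= avg * 0.9]  # clearly below

print(f"average interest: {avg:.1f}")
print("seasonal peaks:", peaks)
print("seasonal troughs:", troughs)
```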
---
## Competitive Analysis
### SERP Analysis
**User prompt:**
> Analyze top 10 search results for "best coffee shops Seoul"
**Agent workflow:**
1. Use Custom Search API
2. Extract title, description, URL
3. Analyze common patterns
4. Check for schema markup
5. Identify content gaps
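Step 3 (common-pattern analysis) can be sketched over the result items; the titles below are illustrative stand-ins for the Custom Search API response:

```python
from collections import Counter
import re

# Illustrative top results, e.g. from the API's items[].title fields
results = [
    {"title": "15 Best Coffee Shops in Seoul (2024 Guide)"},
    {"title": "Top 10 Seoul Coffee Shops You Must Visit"},
    {"title": "Best Specialty Coffee in Seoul - Local Picks"},
]

words = Counter()
for item in results:
    words.update(re.findall(r"[a-z]+", item["title"].lower()))

# Terms recurring across the top results hint at expected title patterns
common = [w for w, n in words.most_common() if n >= 2]
print("recurring title terms:", common)
has_numbers = sum(bool(re.search(r"\d", r["title"])) for r in results)
print(f"{has_numbers}/{len(results)} titles use a number (listicle pattern)")
```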
---
## CLI Script Usage
### Schema Generator
```bash
# Generate Organization schema
python scripts/schema_generator.py \
  --type organization \
  --name "OurDigital Corp" \
  --url "https://ourdigital.com"

# Generate Product schema
python scripts/schema_generator.py \
  --type product \
  --name "SEO Tool" \
  --price 29900 \
  --currency KRW
```
### Schema Validator
```bash
# Validate schema on a URL
python scripts/schema_validator.py \
  --url https://example.com \
  --output report.json

# Validate local JSON-LD file
python scripts/schema_validator.py \
  --file schema.json
```
### Sitemap Validator
```bash
# Validate sitemap
python scripts/sitemap_validator.py \
  --url https://example.com/sitemap.xml \
  --check-urls \
  --output sitemap_report.json
```
### Robots.txt Checker
```bash
# Analyze robots.txt
python scripts/robots_checker.py \
  --url https://example.com/robots.txt

# Test specific URL
python scripts/robots_checker.py \
  --url https://example.com/robots.txt \
  --test-url /admin/dashboard \
  --user-agent Googlebot
```
### Full Audit
```bash
# Run complete audit
python scripts/full_audit.py \
  --url https://example.com \
  --output notion \
  --notion-page-id abc123

# Export to Google Sheets
python scripts/full_audit.py \
  --url https://example.com \
  --output sheets \
  --spreadsheet-id xyz789
```
---
## Output to Notion
### Create Findings Database
**User prompt:**
> Create an SEO audit findings database in Notion for example.com
**Agent workflow:**
1. Search for existing SEO audit pages
2. Create new database with schema
3. Add initial findings from audit
4. Set up views (by priority, by category)
5. Share database link with user
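The database schema created in step 2 can be expressed as a properties payload shaped for the Notion API's `databases.create` endpoint; the field names below mirror the finding entries this skill produces, and the exact select options are illustrative:

```python
# Properties payload for the findings database (step 2), Notion API shape
FINDINGS_PROPERTIES = {
    "Issue": {"title": {}},
    "Category": {"select": {"options": [
        {"name": "Schema/Structured Data"},
        {"name": "Technical SEO"},
        {"name": "Performance"},
    ]}},
    "Priority": {"select": {"options": [
        {"name": "High"}, {"name": "Medium"}, {"name": "Low"},
    ]}},
    "Site": {"url": {}},
    "URL": {"url": {}},
    "Found Date": {"date": {}},
    "Audit ID": {"rich_text": {}},
}
```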
---
## Batch Operations
### Audit Multiple Pages
**User prompt:**
> Check schema markup on these URLs: url1.com, url2.com, url3.com
**Agent workflow:**
1. Queue URLs for processing
2. Validate each URL sequentially
3. Aggregate findings
4. Generate comparison report
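The queue/aggregate loop above is straightforward; a sketch with a stubbed per-URL check standing in for the real scrape-and-validate step:

```python
def check_schema(url):
    """Stub for a per-URL schema check; the real version would scrape the
    page and validate its JSON-LD (canned results are illustrative)."""
    fake_results = {
        "https://url1.com": ["Organization missing 'url'"],
        "https://url2.com": [],
        "https://url3.com": ["WebPage missing 'name'", "no JSON-LD found"],
    }
    return fake_results.get(url, [])

urls = ["https://url1.com", "https://url2.com", "https://url3.com"]
findings = {url: check_schema(url) for url in urls}  # sequential queue

# Aggregate into a comparison report
for url, issues in findings.items():
    status = "OK" if not issues else f"{len(issues)} issue(s)"
    print(f"{url}: {status}")
total = sum(len(v) for v in findings.values())
print(f"total issues across {len(urls)} URLs: {total}")
```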
---
## Integration with Search Console
### Performance Report
**User prompt:**
> Get Search Console performance data for the last 30 days
**Agent workflow:**
1. Verify Search Console access
2. Query search analytics API
3. Get top queries and pages
4. Calculate CTR and position changes
5. Identify opportunities
**Sample output:**
```
Search Console Performance (Last 30 Days)
========================================
Total Clicks: 12,345
Total Impressions: 456,789
Average CTR: 2.7%
Average Position: 15.3
Top Queries:
1. "example product" - 1,234 clicks, position 3.2
2. "example service" - 987 clicks, position 5.1
3. "example review" - 654 clicks, position 8.4
Pages with Opportunities:
- /product-page: High impressions, low CTR (improve title)
- /service-page: Good CTR, position 11 (push to page 1)
```
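The opportunity flags in the report above (step 5) come from simple CTR/position heuristics; a sketch with illustrative rows and cutoffs:

```python
# Illustrative rows, shaped like Search Console search analytics results
rows = [
    {"page": "/product-page", "clicks": 420, "impressions": 60000, "position": 6.1},
    {"page": "/service-page", "clicks": 310, "impressions": 7000, "position": 11.2},
    {"page": "/blog-post", "clicks": 900, "impressions": 20000, "position": 3.4},
]

def opportunities(rows, min_impressions=5000):
    """Flag pages with high impressions but weak CTR, or good CTR stuck
    just off page one (cutoff values are illustrative)."""
    out = []
    for r in rows:
        ctr = r["clicks"] / r["impressions"]
        if r["impressions"] >= min_impressions and ctr < 0.01:
            out.append((r["page"], "high impressions, low CTR - improve title"))
        elif ctr >= 0.04 and 10 < r["position"] <= 20:
            out.append((r["page"], "good CTR, page 2 - push to page 1"))
    return out

flagged = opportunities(rows)
for page, note in flagged:
    print(f"{page}: {note}")
```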
---
## Real-World Examples (OurDigital)
### Example: Audit blog.ourdigital.org
**User prompt:**
> Perform SEO audit for https://blog.ourdigital.org and export to Notion
**Actual Results:**
```
=== SEO Audit: blog.ourdigital.org ===
Robots.txt: ✓ Valid
- 6 disallow rules
- Sitemap declared
Sitemap: ✓ Valid
- 126 posts indexed
- All URLs accessible
Schema Markup: ⚠ Issues Found
- Organization missing 'url' property (High)
- WebPage missing 'name' property (High)
- Missing SearchAction on WebSite (Medium)
- Missing sameAs on Organization (Medium)
Core Web Vitals (Mobile):
- Performance: 53/100
- SEO: 100/100
- LCP: 5.91s ✗ Poor
- CLS: 0.085 ✓ Good
- TBT: 651ms ✗ Poor
Findings exported to Notion: 6 issues
```
### Example: GA4 Traffic Analysis
**User prompt:**
> Get traffic data for OurDigital Blog from GA4
**Actual Results:**
```
GA4 Property: OurDigital Blog (489750460)
Period: Last 30 days
Top Pages by Views:
1. / (Homepage): 86 views
2. /google-business-profile-ownership-authentication: 59 views
3. /information-overload/: 37 views
4. /social-media-vs-sns/: 23 views
5. /reputation-in-connected-world/: 19 views
```
### Example: Search Console Performance
**User prompt:**
> Get Search Console data for ourdigital.org
**Actual Results:**
```
Property: sc-domain:ourdigital.org
Period: Last 30 days
Top Pages by Clicks:
1. ourdigital.org/information-overload - 27 clicks, pos 4.2
2. ourdigital.org/google-business-profile-ownership - 18 clicks, pos 5.9
3. ourdigital.org/social-media-vs-sns - 13 clicks, pos 9.5
4. ourdigital.org/website-migration-redirect - 12 clicks, pos 17.9
5. ourdigital.org/google-brand-lift-measurement - 7 clicks, pos 5.7
```
---
## Notion Database Structure
### Finding Entry Example
**Issue:** Organization schema missing 'url' property
**Properties:**
| Field | Value |
|-------|-------|
| Category | Schema/Structured Data |
| Priority | High |
| Site | https://blog.ourdigital.org |
| URL | https://blog.ourdigital.org/posts/example/ |
| Found Date | 2024-12-14 |
| Audit ID | blog.ourdigital.org-20241214-123456 |
**Page Content:**
```markdown
## Description
The Organization schema on the blog post is missing the required
'url' property that identifies the organization's website.
## Impact
⚠️ May affect rich result eligibility and knowledge panel display
in search results. Google uses the url property to verify and
connect your organization across web properties.
## Recommendation
💡 Add 'url': 'https://ourdigital.org' to the Organization schema
markup in your site's JSON-LD structured data.
```
---
## API Configuration Reference
### Available Properties
| API | Property/Domain | ID |
|-----|-----------------|-----|
| Search Console | sc-domain:ourdigital.org | - |
| GA4 | OurDigital Lab | 218477407 |
| GA4 | OurDigital Journal | 413643875 |
| GA4 | OurDigital Blog | 489750460 |
| Custom Search | - | e5f27994f2bab4bf2 |
### Service Account
```
Email: ourdigital-seo-agent@ourdigital-insights.iam.gserviceaccount.com
File: ~/.credential/ourdigital-seo-agent.json
```