Fix SEO skill 34 bugs, Korean labels, and transition Ahrefs refs to our-seo-agent (#2)
@@ -2,7 +2,7 @@
 
 ## Overview
 
-Keyword strategy and research tool for SEO campaigns. Expands seed keywords via Ahrefs APIs, classifies search intent, clusters topics, performs competitor keyword gap analysis, and supports Korean market keyword discovery including Naver autocomplete.
+Keyword strategy and research tool for SEO campaigns. Expands seed keywords via our-seo-agent CLI or pre-fetched data, classifies search intent, clusters topics, performs competitor keyword gap analysis, and supports Korean market keyword discovery including Naver autocomplete.
 
 ## Quick Start
 
@@ -68,17 +68,13 @@ python scripts/keyword_gap_analyzer.py --target https://example.com --competitor
 - Segment gaps by intent type
 - Prioritize low-KD high-volume opportunities
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `keywords-explorer-overview` | Get keyword metrics (volume, KD, CPC) |
-| `keywords-explorer-matching-terms` | Find matching keyword variations |
-| `keywords-explorer-related-terms` | Discover semantically related keywords |
-| `keywords-explorer-search-suggestions` | Get autocomplete suggestions |
-| `keywords-explorer-volume-by-country` | Compare volume across countries |
-| `keywords-explorer-volume-history` | Track volume trends over time |
-| `site-explorer-organic-keywords` | Get competitor keyword rankings |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -23,14 +23,10 @@ Expand seed keywords, classify search intent, cluster topics, and identify compe
 
 ## MCP Tool Usage
 
-### Ahrefs for Keyword Data
+### SEO Data
 ```
-mcp__ahrefs__keywords-explorer-overview: Get keyword metrics
-mcp__ahrefs__keywords-explorer-matching-terms: Find keyword variations
-mcp__ahrefs__keywords-explorer-related-terms: Discover related keywords
-mcp__ahrefs__keywords-explorer-search-suggestions: Autocomplete suggestions
-mcp__ahrefs__keywords-explorer-volume-by-country: Country volume comparison
-mcp__ahrefs__site-explorer-organic-keywords: Competitor keyword rankings
+our-seo-agent CLI: Primary keyword data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Live keyword research and autocomplete data
 ```
 
 ### Web Search for Naver Discovery
@@ -42,7 +38,7 @@ WebSearch: Naver autocomplete and trend discovery
 
 ### 1. Seed Keyword Expansion
 1. Input seed keyword (Korean or English)
-2. Query Ahrefs matching-terms and related-terms
+2. Query keyword data via our-seo-agent CLI, pre-fetched JSON, or WebSearch
 3. Get search suggestions for long-tail variations
 4. Apply Korean suffix expansion if Korean market
 5. Deduplicate and merge results
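The expansion steps in the hunk above (Korean suffix expansion, then deduplication) can be sketched as follows. This is a minimal illustration only: the suffix list and helper names are assumptions, not the skill's actual implementation.

```python
# Common Korean search-intent suffixes (assumed examples):
# 추천 (recommendation), 후기 (review), 가격 (price), 비용 (cost)
KOREAN_SUFFIXES = ["추천", "후기", "가격", "비용"]

def expand_korean(seed: str) -> list[str]:
    """Append common Korean search suffixes to a seed keyword."""
    return [f"{seed} {suffix}" for suffix in KOREAN_SUFFIXES]

def merge_dedupe(*keyword_lists: list[str]) -> list[str]:
    """Merge keyword lists, dropping duplicates while preserving order."""
    seen: set[str] = set()
    merged: list[str] = []
    for kws in keyword_lists:
        for kw in kws:
            key = kw.strip().lower()
            if key not in seen:
                seen.add(key)
                merged.append(kw)
    return merged

# Seed plus suffix expansions, with one duplicate removed on merge
candidates = merge_dedupe(["임플란트"], expand_korean("임플란트"), ["임플란트 가격"])
```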
@@ -2,7 +2,7 @@
 
 ## Overview
 
-SERP analysis tool for understanding search result landscapes. Detects Google SERP features (featured snippets, PAA, knowledge panels, local pack, video, ads), analyzes Naver SERP composition (blog, cafe, knowledge iN, Smart Store, brand zone, VIEW tab), maps competitor positions, and scores SERP feature opportunities.
+SERP analysis tool for understanding search result landscapes. Detects Google SERP features (featured snippets, PAA, knowledge panels, local pack, video, ads), analyzes Naver SERP composition (blog, cafe, knowledge iN, Smart Store, brand zone, shortform, influencer), maps competitor positions, and scores SERP feature opportunities.
 
 ## Quick Start
 
@@ -65,13 +65,13 @@ python scripts/naver_serp_analyzer.py --keywords-file keywords.txt --json
 - Brand zone presence detection
 - Shortform/influencer content analysis
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `serp-overview` | Get SERP results for a keyword |
-| `keywords-explorer-overview` | Get keyword metrics and SERP features |
-| `site-explorer-organic-keywords` | Map competitor positions |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Live SERP data and Naver section analysis |
+| Notion MCP | Save analysis report to SEO Audit Log database |
 
 ## Output Format
 
@@ -2,7 +2,7 @@
 Naver SERP Analyzer - Naver search result composition analysis
 ==============================================================
 Purpose: Analyze Naver SERP section distribution, content type mapping,
-brand zone detection, and VIEW tab content analysis.
+brand zone detection, and section priority analysis.
 Python: 3.10+
 
 Usage:
@@ -21,11 +21,10 @@ Analyze search engine result page composition for Google and Naver. Detect SERP
 
 ## MCP Tool Usage
 
-### Ahrefs for SERP Data
+### SEO Data
 ```
-mcp__ahrefs__serp-overview: Get SERP results and features for a keyword
-mcp__ahrefs__keywords-explorer-overview: Get keyword metrics, volume, difficulty, and SERP feature flags
-mcp__ahrefs__site-explorer-organic-keywords: Map competitor keyword positions
+our-seo-agent CLI: Primary data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Live SERP data and keyword metrics
 ```
 
 ### Notion for Report Storage
@@ -43,7 +42,7 @@ WebFetch: Fetch Naver SERP HTML for section analysis
 ## Workflow
 
 ### 1. Google SERP Analysis
-1. Fetch SERP data via `mcp__ahrefs__serp-overview` for the target keyword and country
+1. Fetch SERP data via `our-seo-agent` CLI, `--input` JSON, or WebSearch for the target keyword and country
 2. Detect SERP features (featured snippet, PAA, local pack, knowledge panel, video, ads, images, shopping)
 3. Map competitor positions from organic results (domain, URL, title, position)
 4. Classify content type for each result (blog, product, service, news, video)
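Step 2 of the workflow above (SERP feature detection) can be sketched like this. The flat payload shape is an assumption for illustration; the real our-seo-agent output format may differ.

```python
# SERP features named in step 2 of the workflow
SERP_FEATURES = ["featured_snippet", "paa", "local_pack", "knowledge_panel",
                 "video", "ads", "images", "shopping"]

def detect_features(serp: dict) -> list[str]:
    """Return which known SERP features are flagged truthy in the payload."""
    return [feature for feature in SERP_FEATURES if serp.get(feature)]

# Hypothetical pre-fetched payload (shape assumed, not a real API contract)
sample = {
    "featured_snippet": True,
    "paa": True,
    "ads": False,
    "organic": [{"domain": "example.com", "position": 1}],
}
present = detect_features(sample)
```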
@@ -107,7 +106,7 @@ WebFetch: Fetch Naver SERP HTML for section analysis
 
 ## Limitations
 
-- Ahrefs SERP data may have a delay (not real-time)
+- SERP data may be delayed depending on the data source (not real-time)
 - Naver SERP HTML structure changes periodically
 - Brand zone detection depends on HTML class patterns
 - Cannot detect personalized SERP results
@@ -2,7 +2,7 @@
 
 ## Overview
 
-Position tracking tool for monitoring keyword rankings via Ahrefs Rank Tracker. Monitors ranking positions, detects position changes with threshold alerts, calculates visibility scores weighted by search volume, compares against competitors, and segments by brand/non-brand keywords.
+Position tracking tool for monitoring keyword rankings. Monitors ranking positions, detects position changes with threshold alerts, calculates visibility scores weighted by search volume, compares against competitors, and segments by brand/non-brand keywords.
 
 ## Quick Start
 
@@ -41,7 +41,7 @@ python scripts/position_tracker.py --target https://example.com --competitor htt
 ```
 
 **Capabilities**:
-- Current ranking position retrieval via Ahrefs Rank Tracker
+- Current ranking position retrieval via our-seo-agent CLI or pre-fetched data
 - Position change detection with configurable threshold alerts
 - Visibility score calculation (weighted by search volume)
 - Brand vs non-brand keyword segmentation
@@ -69,17 +69,13 @@ python scripts/ranking_reporter.py --target https://example.com --competitor htt
 - Competitor overlap and position comparison
 - Average position by keyword group
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `rank-tracker-overview` | Get rank tracking overview for project |
-| `rank-tracker-competitors-overview` | Compare against competitors |
-| `rank-tracker-competitors-pages` | Competitor page-level rankings |
-| `rank-tracker-competitors-stats` | Competitor ranking statistics |
-| `rank-tracker-serp-overview` | SERP details for tracked keywords |
-| `management-projects` | List Ahrefs projects |
-| `management-project-keywords` | Get tracked keywords for project |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -1,7 +1,7 @@
 ---
 name: seo-position-tracking
 description: |
-  Keyword position tracking and ranking monitoring via Ahrefs Rank Tracker.
+  Keyword position tracking and ranking monitoring.
   Triggers: rank tracking, position monitoring, keyword rankings, visibility score, ranking report, 키워드 순위, 순위 추적.
 ---
 
@@ -9,11 +9,11 @@ description: |
 
 ## Purpose
 
-Monitor keyword ranking positions, detect significant changes, calculate visibility scores, and compare against competitors using Ahrefs Rank Tracker data. Provides actionable alerts for ranking drops and segment-level performance breakdown.
+Monitor keyword ranking positions, detect significant changes, calculate visibility scores, and compare against competitors using our-seo-agent CLI or pre-fetched ranking data. Provides actionable alerts for ranking drops and segment-level performance breakdown.
 
 ## Core Capabilities
 
-1. **Position Monitoring** - Retrieve current keyword ranking positions from Ahrefs Rank Tracker projects
+1. **Position Monitoring** - Retrieve current keyword ranking positions from our-seo-agent CLI or pre-fetched data
 2. **Change Detection** - Detect significant position changes with configurable threshold alerts (severity: critical/high/medium/low)
 3. **Visibility Scoring** - Calculate weighted visibility scores using CTR-curve model (position 1 = 30%, position 2 = 15%, etc.)
 4. **Brand/Non-brand Segmentation** - Automatically classify keywords by brand relevance and search intent type
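Capability 3 above (volume-weighted visibility via a CTR curve) can be sketched as follows. The curve values for positions 1 and 2 come from the text; values beyond position 2 and the input record shape are illustrative assumptions.

```python
# CTR curve: positions 1 and 2 from the capability description above;
# deeper positions are assumed illustrative values
CTR_CURVE = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings: list[dict]) -> float:
    """Sum of search volume x estimated CTR across tracked keywords.

    Each entry (shape assumed): {"keyword": str, "position": int, "volume": int}.
    """
    total = 0.0
    for r in rankings:
        total += r["volume"] * CTR_CURVE.get(r["position"], 0.01)
    return round(total, 1)

score = visibility_score([
    {"keyword": "implant cost", "position": 1, "volume": 1000},
    {"keyword": "implant price", "position": 2, "volume": 400},
])
```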
@@ -21,15 +21,10 @@ Monitor keyword ranking positions, detect significant changes, calculate visibil
 
 ## MCP Tool Usage
 
-### Ahrefs Rank Tracker Tools
+### SEO Data
 ```
-mcp__ahrefs__rank-tracker-overview: Get rank tracking overview with current positions
-mcp__ahrefs__rank-tracker-competitors-overview: Compare rankings against competitors
-mcp__ahrefs__rank-tracker-competitors-pages: Competitor page-level ranking data
-mcp__ahrefs__rank-tracker-competitors-stats: Detailed competitor ranking statistics
-mcp__ahrefs__rank-tracker-serp-overview: SERP details for tracked keywords
-mcp__ahrefs__management-projects: List available Ahrefs projects
-mcp__ahrefs__management-project-keywords: Get tracked keywords for a project
+our-seo-agent CLI: Primary ranking data source (future); use --input for pre-fetched JSON
+WebSearch: Supplementary ranking data
 ```
 
 ### Notion for Report Storage
@@ -41,7 +36,7 @@ mcp__notion__notion-update-page: Update existing tracking entries
 ## Workflow
 
 ### Phase 1: Data Collection
-1. Identify Ahrefs project via `management-projects`
+1. Identify tracking project or use `--input` for pre-fetched data
 2. Retrieve tracked keywords via `management-project-keywords`
 3. Fetch current positions via `rank-tracker-overview`
 4. Fetch competitor data via `rank-tracker-competitors-overview` (if requested)
@@ -68,19 +68,13 @@ python scripts/link_gap_finder.py --target https://example.com --competitor http
 - Categorize link sources (editorial, directory, forum, blog, news)
 - Prioritize by feasibility and impact
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-all-backlinks` | Get all backlinks for a target |
-| `site-explorer-backlinks-stats` | Backlink statistics overview |
-| `site-explorer-referring-domains` | List referring domains |
-| `site-explorer-anchors` | Anchor text distribution |
-| `site-explorer-broken-backlinks` | Find broken backlinks |
-| `site-explorer-domain-rating` | Get Domain Rating |
-| `site-explorer-domain-rating-history` | DR trend over time |
-| `site-explorer-refdomains-history` | Referring domains trend |
-| `site-explorer-linked-domains` | Domains linked from target |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -22,17 +22,10 @@ Analyze backlink profiles, detect toxic links, find competitor link gaps, track
 
 ## MCP Tool Usage
 
-### Ahrefs for Backlink Data
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-all-backlinks: Get all backlinks for a target
-mcp__ahrefs__site-explorer-backlinks-stats: Backlink statistics overview
-mcp__ahrefs__site-explorer-referring-domains: List referring domains
-mcp__ahrefs__site-explorer-anchors: Anchor text distribution
-mcp__ahrefs__site-explorer-broken-backlinks: Find broken backlinks
-mcp__ahrefs__site-explorer-domain-rating: Get Domain Rating
-mcp__ahrefs__site-explorer-domain-rating-history: DR trend over time
-mcp__ahrefs__site-explorer-refdomains-history: Referring domains trend
-mcp__ahrefs__site-explorer-linked-domains: Domains linked from target
+our-seo-agent CLI: Primary backlink data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary backlink data
 ```
 
 ### Notion for Report Storage
@@ -2,7 +2,7 @@
 
 ## Overview
 
-Content strategy tool for SEO-driven content planning. Performs content inventory via sitemap crawl and Ahrefs top pages, scores content performance, detects content decay, analyzes topic gaps vs competitors, maps topic clusters, and generates content briefs. Supports Korean content patterns (Naver Blog format, review/후기 content).
+Content strategy tool for SEO-driven content planning. Performs content inventory via sitemap crawl and our-seo-agent CLI, scores content performance, detects content decay, analyzes topic gaps vs competitors, maps topic clusters, and generates content briefs. Supports Korean content patterns (Naver Blog format, review/후기 content).
 
 ## Quick Start
 
@@ -42,7 +42,7 @@ python scripts/content_auditor.py --url https://example.com --type blog --json
 ```
 
 **Capabilities**:
-- Content inventory via sitemap crawl + Ahrefs top-pages
+- Content inventory via sitemap crawl + our-seo-agent CLI or pre-fetched data
 - Performance scoring (traffic, rankings, backlinks)
 - Content decay detection (pages losing traffic over time)
 - Content type classification (blog, product, service, landing, resource)
@@ -85,15 +85,13 @@ python scripts/content_brief_generator.py --keyword "dental implant cost" --url
 - Internal linking suggestions
 - Korean content format recommendations
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-top-pages` | Get top performing pages |
-| `site-explorer-pages-by-traffic` | Pages ranked by organic traffic |
-| `site-explorer-organic-keywords` | Keywords per page |
-| `site-explorer-organic-competitors` | Find content competitors |
-| `site-explorer-best-by-external-links` | Best content by links |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -20,16 +20,10 @@ Audit existing content performance, identify topic gaps vs competitors, map topi
 
 ## MCP Tool Usage
 
-### Ahrefs for Content Data
+### SEO Data
 ```
-site-explorer-top-pages: Get top performing pages
-site-explorer-pages-by-traffic: Pages ranked by organic traffic
-site-explorer-organic-keywords: Keywords per page
-site-explorer-organic-competitors: Find content competitors
-site-explorer-best-by-external-links: Best content by backlinks
-keywords-explorer-matching-terms: Secondary keyword suggestions
-keywords-explorer-related-terms: LSI keyword suggestions
-serp-overview: Analyze top ranking results for a keyword
+our-seo-agent CLI: Primary content/traffic data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary content data
 ```
 
 ### WebSearch for Content Research
@@ -47,7 +41,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
 
 ### 1. Content Audit
 1. Crawl sitemap to discover all content URLs
-2. Fetch top pages data from Ahrefs (traffic, keywords, backlinks)
+2. Fetch top pages data via our-seo-agent CLI, pre-fetched JSON, or WebSearch
 3. Classify content types (blog, product, service, landing, resource)
 4. Score each page performance (0-100 composite)
 5. Detect decaying content (traffic decline patterns)
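Step 5 above (decay detection from traffic decline patterns) might look like the sketch below. The 30% drop threshold and monthly-series input are assumptions for illustration, not the auditor's documented heuristic.

```python
def is_decaying(monthly_traffic: list[int], drop_threshold: float = 0.3) -> bool:
    """True if the most recent month fell more than drop_threshold below the prior peak."""
    if len(monthly_traffic) < 3:
        return False  # not enough history in standalone mode
    peak = max(monthly_traffic[:-1])
    recent = monthly_traffic[-1]
    return peak > 0 and (peak - recent) / peak > drop_threshold

# 600 is 50% below the 1200 peak, which exceeds the assumed 30% threshold
decayed = is_decaying([1200, 1100, 900, 600])
```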
@@ -56,7 +50,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
 8. Generate recommendations
 
 ### 2. Content Gap Analysis
-1. Gather target site keywords from Ahrefs
+1. Gather target site keywords via our-seo-agent CLI or pre-fetched data
 2. Gather competitor top pages and keywords
 3. Identify topics present in competitors but missing from target
 4. Score gaps by priority (traffic potential + competition coverage)
@@ -123,7 +117,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
 
 ## Limitations
 
-- Ahrefs API required for traffic and keyword data
+- our-seo-agent CLI or pre-fetched JSON required for traffic and keyword data
 - Competitor analysis limited to publicly available content
 - Content decay detection uses heuristic without historical data in standalone mode
 - Topic clustering requires minimum 3 topics per cluster
@@ -68,12 +68,13 @@ python scripts/product_schema_checker.py --sitemap https://example.com/product-s
 - Merchant listing schema support
 - Korean market: Naver Shopping structured data requirements
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-pages-by-traffic` | Identify top product/category pages |
-| `site-explorer-organic-keywords` | Product page keyword performance |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -23,10 +23,10 @@ Audit e-commerce sites for product page optimization, structured data validation
 
 ## MCP Tool Usage
 
-### Ahrefs for Product Page Discovery
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-pages-by-traffic: Identify top product and category pages
-mcp__ahrefs__site-explorer-organic-keywords: Product page keyword performance
+our-seo-agent CLI: Primary product page data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary product page data
 ```
 
 ### WebSearch for Marketplace Checks
@@ -43,7 +43,7 @@ mcp__notion__notion-create-pages: Save audit report to SEO Audit Log database
 ## Workflow
 
 ### 1. Product Page Audit
-1. Discover product pages via Ahrefs pages-by-traffic or sitemap
+1. Discover product pages via our-seo-agent CLI, pre-fetched JSON, or sitemap crawl
 2. For each product page check:
    - Title tag: contains product name, under 60 chars
    - Meta description: includes price/feature info, under 155 chars
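The per-page checks in step 2 above can be sketched as a small validator. The 60/155-character limits come from the checklist itself; the function name and issue strings are illustrative assumptions.

```python
def check_product_page(title: str, meta: str, product_name: str) -> list[str]:
    """Return human-readable issues for one product page, per the checklist above."""
    issues = []
    if product_name.lower() not in title.lower():
        issues.append("title missing product name")
    if len(title) > 60:
        issues.append("title over 60 chars")
    if len(meta) > 155:
        issues.append("meta description over 155 chars")
    return issues

# A page that passes all three checks yields no issues
issues = check_product_page(
    title="Buy Widgets Online | Example Store",
    meta="Premium widgets from $9.99 with free shipping.",
    product_name="Widget",
)
```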
@@ -2,7 +2,7 @@
 
 ## Overview
 
-SEO KPI and performance framework for unified metrics aggregation across all SEO dimensions. Establishes baselines, sets targets (30/60/90-day), generates executive summaries with health scores, provides tactical breakdowns, estimates ROI using Ahrefs traffic cost, and supports period-over-period comparison (MoM, QoQ).
+SEO KPI and performance framework for unified metrics aggregation across all SEO dimensions. Establishes baselines, sets targets (30/60/90-day), generates executive summaries with health scores, provides tactical breakdowns, estimates ROI using our-seo-agent traffic cost data, and supports period-over-period comparison (MoM, QoQ).
 
 ## Quick Start
 
@@ -49,10 +49,10 @@ python scripts/kpi_aggregator.py --url https://example.com --roi --json
 - Content KPIs (indexed pages, content freshness score, thin content ratio)
 - Link KPIs (domain rating, referring domains, link velocity)
 - Local KPIs (GBP visibility, review score, citation accuracy)
-- Multi-source data aggregation from Ahrefs and other skill outputs
+- Multi-source data aggregation from our-seo-agent CLI and other skill outputs
 - Baseline establishment and target setting (30/60/90 day)
 - Overall health score (0-100) with weighted dimensions
-- ROI estimation using Ahrefs organic traffic cost
+- ROI estimation using organic traffic cost data
 
 ## Performance Reporter
 
@@ -79,15 +79,13 @@ python scripts/performance_reporter.py --url https://example.com --period monthl
 - Target vs actual comparison with progress %
 - Traffic value change (ROI proxy)
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics` | Current organic metrics |
-| `site-explorer-metrics-history` | Historical metrics trends |
-| `site-explorer-metrics-by-country` | Country-level breakdown |
-| `site-explorer-domain-rating-history` | DR trend over time |
-| `site-explorer-total-search-volume-history` | Total keyword volume trend |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -10,26 +10,23 @@ description: |
 
 ## Purpose
 
-Aggregate SEO KPIs across all dimensions into a unified dashboard. Establish baselines, set targets (30/60/90-day), generate executive summaries with health scores, provide tactical breakdowns, estimate ROI using Ahrefs traffic cost, and support period-over-period comparison (MoM, QoQ, YoY).
+Aggregate SEO KPIs across all dimensions into a unified dashboard. Establish baselines, set targets (30/60/90-day), generate executive summaries with health scores, provide tactical breakdowns, estimate ROI using our-seo-agent traffic cost data, and support period-over-period comparison (MoM, QoQ, YoY).
 
 ## Core Capabilities
 
 1. **KPI Aggregation** - Unified metrics across 7 dimensions (traffic, rankings, links, technical, content, engagement, local)
 2. **Health Scoring** - Weighted 0-100 score with trend direction
 3. **Baseline & Targets** - Establish baselines and set 30/60/90 day growth targets
-4. **ROI Estimation** - Traffic value from Ahrefs organic cost
+4. **ROI Estimation** - Traffic value from organic cost data
 5. **Performance Reporting** - Period-over-period comparison with executive summary
 6. **Tactical Breakdown** - Actionable next steps per dimension
 
 ## MCP Tool Usage
 
-### Ahrefs for SEO Metrics
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-metrics: Current organic metrics snapshot
-mcp__ahrefs__site-explorer-metrics-history: Historical trend data
-mcp__ahrefs__site-explorer-metrics-by-country: Country-level breakdown
-mcp__ahrefs__site-explorer-domain-rating-history: Domain rating trend
-mcp__ahrefs__site-explorer-total-search-volume-history: Keyword volume trend
+our-seo-agent CLI: Primary metrics source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary metrics data
 ```
 
 ### Notion for Report Storage
@@ -45,7 +42,7 @@ mcp__notion__*: Save reports to SEO Audit Log database
|
|||||||
3. Calculate dimension scores with weights (traffic 25%, rankings 20%, technical 20%, content 15%, links 15%, local 5%)
|
3. Calculate dimension scores with weights (traffic 25%, rankings 20%, technical 20%, content 15%, links 15%, local 5%)
|
||||||
4. Compute overall health score (0-100)
|
4. Compute overall health score (0-100)
|
||||||
5. Set 30/60/90 day targets (5%/10%/20% improvement)
|
5. Set 30/60/90 day targets (5%/10%/20% improvement)
|
||||||
6. Estimate ROI from Ahrefs traffic cost (divide raw cost by 100 for USD)
|
6. Estimate ROI from traffic cost data (use our-seo-agent CLI or pre-fetched JSON)
|
||||||
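The scoring and target logic in steps 3-6 can be sketched in Python. This is a minimal illustration: the dimension weights, the 5%/10%/20% targets, and the divide-by-100 cost convention come from the steps above, while the function names and sample values are made up for the sketch.

```python
# Sketch of the health-score, target, and ROI logic from steps 3-6.
# Dimension scores are assumed to be pre-computed on a 0-100 scale.

WEIGHTS = {
    "traffic": 0.25, "rankings": 0.20, "technical": 0.20,
    "content": 0.15, "links": 0.15, "local": 0.05,
}

def health_score(dimension_scores: dict) -> float:
    """Weighted 0-100 overall health score across the six weighted dimensions."""
    return round(sum(dimension_scores[d] * w for d, w in WEIGHTS.items()), 1)

def growth_targets(baseline: float) -> dict:
    """30/60/90-day targets: +5%, +10%, +20% over the baseline metric."""
    return {days: round(baseline * (1 + pct), 1)
            for days, pct in ((30, 0.05), (60, 0.10), (90, 0.20))}

def roi_usd(raw_traffic_cost: int) -> float:
    """Raw organic traffic cost is divided by 100 to get USD."""
    return raw_traffic_cost / 100

scores = {"traffic": 80, "rankings": 70, "technical": 90,
          "content": 60, "links": 75, "local": 50}
print(health_score(scores))   # weighted overall score, ~74.8 for these inputs
print(growth_targets(12000))  # 30/60/90-day targets from a 12,000-visit baseline
print(roi_usd(250000))        # 2500.0
```

The engagement dimension is intentionally absent from the weights, matching the limitation that engagement KPIs require Google Analytics.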
 
 ### 2. Performance Reporting
 1. Determine date range from period (monthly/quarterly/yearly/custom)

@@ -94,10 +91,10 @@ mcp__notion__*: Save reports to SEO Audit Log database
 
 ## Limitations
 
-- Local KPIs require external GBP data (not available via Ahrefs)
+- Local KPIs require external GBP data (not available via our-seo-agent)
 - Engagement KPIs (bounce rate, session duration) require Google Analytics
 - Technical health is estimated heuristically from available data
-- ROI is estimated from Ahrefs traffic cost, not actual revenue
+- ROI is estimated from organic traffic cost data, not actual revenue
 
 ## Notion Output (Required)
 
@@ -83,12 +83,13 @@ python scripts/international_auditor.py --url https://example.com --korean-expan
 - CJK-specific URL encoding issues
 - Regional search engine considerations (Naver, Baidu, Yahoo Japan)
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics-by-country` | Country-level traffic distribution |
-| `site-explorer-organic-keywords` | Keywords by country filter |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -21,10 +21,10 @@ Audit international SEO implementation: hreflang tags, URL structure patterns, c
 
 ## MCP Tool Usage
 
-### Ahrefs for Country Metrics
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-metrics-by-country: Country-level traffic distribution
-mcp__ahrefs__site-explorer-organic-keywords: Keywords filtered by country
+our-seo-agent CLI: Primary country metrics source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary international data
 ```
 
 ### Notion for Report Storage

@@ -69,7 +69,7 @@ WebSearch: Research hreflang implementation guides and regional search engine re
 4. Recommend proper implementation (suggest, do not force)
 
 ### 5. Korean Expansion Analysis (Optional)
-1. Analyze current traffic by country via Ahrefs
+1. Analyze current traffic by country via our-seo-agent CLI or pre-fetched data
 2. Recommend priority target markets for Korean businesses
 3. Check CJK-specific URL encoding issues
 4. Advise on regional search engines (Naver, Baidu, Yahoo Japan)

@@ -2,7 +2,7 @@
 
 ## Overview
 
-AI search visibility and brand radar tool for tracking how a brand appears in AI-generated search answers. Monitors AI answer citations, tracks share of voice in AI search vs competitors, analyzes cited domains and pages, and tracks impressions/mentions history. Uses Ahrefs Brand Radar APIs for comprehensive AI visibility monitoring.
+AI search visibility and brand radar tool for tracking how a brand appears in AI-generated search answers. Monitors AI answer citations, tracks share of voice in AI search vs competitors, analyzes cited domains and pages, and tracks impressions/mentions history. Uses our-seo-agent CLI or pre-fetched data for comprehensive AI visibility monitoring.
 
 ## Quick Start
 

@@ -74,19 +74,13 @@ python scripts/ai_citation_analyzer.py --target example.com --responses --json
 - Competitor citation comparison
 - Recommendation generation for improving AI visibility
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `brand-radar-ai-responses` | Get AI-generated responses mentioning brand |
-| `brand-radar-cited-domains` | Domains cited in AI answers |
-| `brand-radar-cited-pages` | Specific pages cited in AI answers |
-| `brand-radar-impressions-history` | Brand impression trend over time |
-| `brand-radar-impressions-overview` | Current impression metrics |
-| `brand-radar-mentions-history` | Brand mention trend over time |
-| `brand-radar-mentions-overview` | Current mention metrics |
-| `brand-radar-sov-history` | Share of voice trend |
-| `brand-radar-sov-overview` | Current share of voice |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -2,14 +2,14 @@
 name: seo-ai-visibility
 description: |
   AI search visibility and brand radar monitoring. Tracks how a brand appears
-  in AI-generated search answers using Ahrefs Brand Radar APIs.
+  in AI-generated search answers using our-seo-agent CLI or pre-fetched data.
   Triggers: AI search, AI visibility, brand radar, AI citations,
   share of voice, AI answers, AI mentions.
 ---
 
 # SEO AI Visibility & Brand Radar
 
-Monitor and analyze brand visibility in AI-generated search results. This skill uses Ahrefs Brand Radar APIs to track impressions, mentions, share of voice, cited domains, cited pages, and AI response content.
+Monitor and analyze brand visibility in AI-generated search results. This skill uses our-seo-agent CLI or pre-fetched data to track impressions, mentions, share of voice, cited domains, cited pages, and AI response content.
 
 ## Capabilities
 

@@ -30,24 +30,18 @@ Monitor and analyze brand visibility in AI-generated search results. This skill
 ## Workflow
 
 1. **Input**: User provides target domain and optional competitors
-2. **Data Collection**: Fetch metrics from Ahrefs Brand Radar APIs
+2. **Data Collection**: Fetch metrics from our-seo-agent CLI or pre-fetched JSON
 3. **Analysis**: Calculate trends, compare competitors, analyze sentiment
 4. **Recommendations**: Generate actionable Korean-language recommendations
 5. **Output**: JSON report and Notion database entry
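The competitor comparison in step 3 reduces to a share-of-voice calculation over the tracked set. A minimal sketch; brand names and mention counts are illustrative stand-ins for collected data, and the percentages are meaningful only relative to the brands included:

```python
# Share of voice = a brand's mentions as a share of all mentions
# across the tracked competitor set.

def share_of_voice(mentions: dict) -> dict:
    """Return each brand's share (%) of total mentions in the tracked set."""
    total = sum(mentions.values())
    return {brand: round(100 * count / total, 1)
            for brand, count in mentions.items()}

tracked = {"example.com": 120, "competitor-a.com": 200, "competitor-b.com": 80}
print(share_of_voice(tracked))
# {'example.com': 30.0, 'competitor-a.com': 50.0, 'competitor-b.com': 20.0}
```

Adding or removing a competitor changes every share, which is why SoV figures are only comparable across runs with the same tracked set.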
 
-## Ahrefs MCP Tools
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `brand-radar-ai-responses` | AI-generated responses mentioning brand |
-| `brand-radar-cited-domains` | Domains cited in AI answers |
-| `brand-radar-cited-pages` | Specific pages cited in AI answers |
-| `brand-radar-impressions-history` | Impression trend over time |
-| `brand-radar-impressions-overview` | Current impression metrics |
-| `brand-radar-mentions-history` | Mention trend over time |
-| `brand-radar-mentions-overview` | Current mention metrics |
-| `brand-radar-sov-history` | Share of voice trend |
-| `brand-radar-sov-overview` | Current share of voice |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary AI visibility data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary AI search data |
+| Notion MCP | Save audit report to database |
 
 ## Notion Output
 

@@ -100,7 +94,7 @@ All reports are saved to the OurDigital SEO Audit Log:
 
 ## Limitations
 
-- Requires Ahrefs Brand Radar API access (not available in basic plans)
+- Requires our-seo-agent CLI or pre-fetched AI visibility data
 - AI search landscape changes rapidly; data may not reflect real-time state
 - Share of Voice metrics are relative to tracked competitor set only
 - Sentiment analysis based on AI-generated text, not user perception

@@ -2,7 +2,7 @@
 
 ## Overview
 
-Knowledge Graph and Entity SEO tool for analyzing brand entity presence in Google Knowledge Graph, Knowledge Panels, People Also Ask (PAA), and FAQ rich results. Checks entity attribute completeness, Wikipedia/Wikidata presence, and Korean equivalents (Naver knowledge iN, Naver encyclopedia). Uses WebSearch and WebFetch for data collection, Ahrefs serp-overview for SERP feature detection.
+Knowledge Graph and Entity SEO tool for analyzing brand entity presence in Google Knowledge Graph, Knowledge Panels, People Also Ask (PAA), and FAQ rich results. Checks entity attribute completeness, Wikipedia/Wikidata presence, and Korean equivalents (Naver knowledge iN, Naver encyclopedia). Uses WebSearch and WebFetch for data collection, WebSearch for SERP feature detection.
 
 ## Quick Start
 

@@ -75,7 +75,7 @@ python scripts/entity_auditor.py --url https://example.com --entity "Brand Name"
 |--------|---------|
 | WebSearch | Search for entity/brand to detect Knowledge Panel |
 | WebFetch | Fetch Wikipedia, Wikidata, Naver pages |
-| Ahrefs `serp-overview` | SERP feature detection for entity keywords |
+| WebSearch | SERP feature detection for entity keywords |
 
 ## Output Format
 
@@ -48,7 +48,7 @@ Analyze brand entity presence in Google Knowledge Graph, Knowledge Panels, Peopl
 3. Check sameAs links accessibility
 4. Use **WebSearch** to search brand name and analyze SERP features
 5. Monitor PAA questions for brand keywords
-6. Use **Ahrefs serp-overview** for SERP feature detection
+6. Use **WebSearch** for SERP feature detection
 7. Save report to **Notion** SEO Audit Log
 
 ## Tools Used

@@ -57,7 +57,7 @@ Analyze brand entity presence in Google Knowledge Graph, Knowledge Panels, Peopl
 |------|---------|
 | WebSearch | Search for entity/brand to detect Knowledge Panel |
 | WebFetch | Fetch Wikipedia, Wikidata, Naver pages, website schemas |
-| Ahrefs `serp-overview` | SERP feature detection for entity keywords |
+| WebSearch | SERP feature detection for entity keywords |
 | Notion | Save audit reports to SEO Audit Log database |
 
 ## Notion Output

@@ -2,7 +2,7 @@
 
 ## Overview
 
-Competitor intelligence and benchmarking tool for comprehensive SEO competitive analysis. Auto-discovers competitors via Ahrefs, builds competitor profile cards (DR, traffic, keywords, backlinks, content volume), creates head-to-head comparison matrices, tracks traffic trends, analyzes keyword overlap, compares content freshness/volume, and scores competitive threats. Supports Korean market competitor analysis including Naver Blog/Cafe presence.
+Competitor intelligence and benchmarking tool for comprehensive SEO competitive analysis. Auto-discovers competitors via our-seo-agent CLI, builds competitor profile cards (DR, traffic, keywords, backlinks, content volume), creates head-to-head comparison matrices, tracks traffic trends, analyzes keyword overlap, compares content freshness/volume, and scores competitive threats. Supports Korean market competitor analysis including Naver Blog/Cafe presence.
 
 ## Quick Start
 

@@ -41,7 +41,7 @@ python scripts/competitor_profiler.py --target https://example.com --korean-mark
 ```
 
 **Capabilities**:
-- Competitor auto-discovery via Ahrefs organic-competitors
+- Competitor auto-discovery via our-seo-agent CLI or pre-fetched data
 - Competitor profile cards:
   - Domain Rating (DR)
   - Organic traffic estimate

@@ -78,21 +78,13 @@ python scripts/competitive_monitor.py --target https://example.com --scope traff
 - Alert generation for significant competitive movements
 - Market share estimation based on organic traffic
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics` | Current organic metrics |
-| `site-explorer-metrics-history` | Historical metrics trends |
-| `site-explorer-organic-competitors` | Discover organic competitors |
-| `site-explorer-organic-keywords` | Keyword rankings per domain |
-| `site-explorer-top-pages` | Top performing pages |
-| `site-explorer-pages-by-traffic` | Pages ranked by traffic |
-| `site-explorer-domain-rating` | Domain Rating |
-| `site-explorer-domain-rating-history` | DR trend |
-| `site-explorer-referring-domains` | Referring domain list |
-| `site-explorer-backlinks-stats` | Backlink overview |
-| `site-explorer-pages-history` | Page index history |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -15,7 +15,7 @@ Comprehensive competitor intelligence for SEO: auto-discover competitors, build
 
 ## Core Capabilities
 
-1. **Competitor Discovery** - Auto-discover organic competitors via Ahrefs
+1. **Competitor Discovery** - Auto-discover organic competitors via our-seo-agent CLI
 2. **Profile Cards** - DR, traffic, keywords, referring domains, top pages, content volume
 3. **Comparison Matrix** - Multi-dimensional head-to-head comparison
 4. **Keyword Overlap** - Shared, unique, and gap keyword analysis

@@ -26,19 +26,10 @@ Comprehensive competitor intelligence for SEO: auto-discover competitors, build
 
 ## MCP Tool Usage
 
-### Ahrefs for Competitive Data
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-organic-competitors: Discover organic competitors
-mcp__ahrefs__site-explorer-metrics: Current organic metrics
-mcp__ahrefs__site-explorer-metrics-history: Historical metric trends
-mcp__ahrefs__site-explorer-domain-rating: Domain Rating score
-mcp__ahrefs__site-explorer-domain-rating-history: DR trend over time
-mcp__ahrefs__site-explorer-organic-keywords: Keyword rankings per domain
-mcp__ahrefs__site-explorer-top-pages: Top performing pages
-mcp__ahrefs__site-explorer-pages-by-traffic: Pages ranked by traffic
-mcp__ahrefs__site-explorer-referring-domains: Referring domain list
-mcp__ahrefs__site-explorer-backlinks-stats: Backlink overview
-mcp__ahrefs__site-explorer-pages-history: Page index history
+our-seo-agent CLI: Primary competitive data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary competitor data
 ```
 
 ### Notion for Report Storage

@@ -55,7 +46,7 @@ WebSearch: Check Naver Blog/Cafe presence for competitors
 
 ### Competitor Profiling
 1. Accept target URL/domain
-2. Auto-discover competitors via Ahrefs organic-competitors (or use provided list)
+2. Auto-discover competitors via our-seo-agent CLI or use provided list
 3. Build profile card for target and each competitor (DR, traffic, keywords, backlinks, content)
 4. Analyze keyword overlap between target and each competitor
 5. Build multi-dimensional comparison matrix
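The keyword overlap in step 4 is plain set arithmetic over the two domains' keyword lists. A sketch with illustrative keyword sets; real lists would come from our-seo-agent output or pre-fetched JSON, capped at the top 1,000 keywords per domain:

```python
# Shared / unique / gap keyword analysis between a target and one competitor.
# Keyword sets are illustrative stand-ins for ranking data.

target = {"seo audit", "keyword research", "backlink checker"}
competitor = {"keyword research", "rank tracker", "backlink checker", "seo reporting"}

shared = target & competitor   # both domains rank for these
unique = target - competitor   # only the target ranks for these
gaps = competitor - target     # competitor ranks, target does not (opportunities)

print(sorted(shared))  # ['backlink checker', 'keyword research']
print(sorted(unique))  # ['seo audit']
print(sorted(gaps))    # ['rank tracker', 'seo reporting']
```

The `gaps` set is the input to gap prioritization; repeating the computation per competitor and intersecting the gap sets surfaces keywords that every rival covers but the target misses.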
@@ -136,9 +127,9 @@ WebSearch: Check Naver Blog/Cafe presence for competitors
 
 ## Limitations
 
-- Ahrefs data has ~24h freshness lag
+- Data freshness depends on source and collection method
 - Keyword overlap limited to top 1,000 keywords per domain
-- Content velocity based on Ahrefs page index (not real-time crawl)
+- Content velocity based on page index data (not real-time crawl)
 - Naver presence detection is heuristic-based
 
 ## Notion Output (Required)

@@ -78,8 +78,8 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --sitemap https://
 # Per-bot profiling
 python scripts/crawl_budget_analyzer.py --log-file access.log --scope bots --json
 
-# With Ahrefs page history comparison
-python scripts/crawl_budget_analyzer.py --log-file access.log --url https://example.com --ahrefs --json
+# With external page history comparison
+python scripts/crawl_budget_analyzer.py --log-file access.log --url https://example.com --input pages.json --json
 ```
 
 **Capabilities**:

@@ -112,7 +112,7 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --url https://exam
 |--------|---------|
 | Server access logs | Primary crawl data |
 | XML sitemap | Reference for expected crawlable pages |
-| Ahrefs `site-explorer-pages-history` | Compare indexed pages with crawled pages |
+| `our-seo-agent` CLI | Compare indexed pages with crawled pages (future) |
 
 ## Output Format
 
@@ -62,8 +62,8 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --scope orphans --
 python scripts/crawl_budget_analyzer.py --log-file access.log --scope bots --json
 ```
 
-### Step 4: Cross-Reference with Ahrefs (Optional)
-Use `site-explorer-pages-history` to compare indexed pages vs crawled pages.
+### Step 4: Cross-Reference with External Data (Optional)
+Use `our-seo-agent` CLI or provide pre-fetched JSON via `--input` to compare indexed pages vs crawled pages. WebSearch can supplement with current indexing data.
 
 ### Step 5: Generate Recommendations
 Prioritized action items:

@@ -76,12 +76,21 @@ Prioritized action items:
 ### Step 6: Report to Notion
 Save Korean-language report to SEO Audit Log database.
 
-## MCP Tools Used
+| Property | Type | Description |
+|----------|------|-------------|
+| Issue | Title | Report title (Korean + date) |
+| Site | URL | Audited website URL |
+| Category | Select | Crawl Budget |
+| Priority | Select | Based on efficiency score |
+| Found Date | Date | Analysis date (YYYY-MM-DD) |
+| Audit ID | Rich Text | Format: CRAWL-YYYYMMDD-NNN |
 
-| Tool | Purpose |
-|------|---------|
-| Ahrefs `site-explorer-pages-history` | Compare indexed pages with crawled pages |
-| Notion | Save audit report to database |
+## Data Sources
+
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Future primary data source; use `--input` for pre-fetched JSON |
+| Notion MCP | Save audit report to database |
 | WebSearch | Current bot documentation and best practices |
 
 ## Output Format

@@ -118,6 +127,27 @@ Save Korean-language report to SEO Audit Log database.
 }
 ```
 
+## Korean Output Example
+
+```
+# 크롤 예산 분석 보고서 - example.com
+
+## 분석 기간: 2025-01-01 ~ 2025-01-31
+
+### 봇별 크롤 현황
+| 봇 | 요청 수 | 고유 URL | 일 평균 |
+|----|---------|---------|---------|
+| Googlebot | 80,000 | 12,000 | 2,580 |
+| Yeti (Naver) | 35,000 | 8,000 | 1,129 |
+
+### 크롤 낭비 요인
+- 파라미터 URL: 5,000건 (3.3%)
+- 리다이렉트 체인: 2,000건 (1.3%)
+- 소프트 404: 1,500건 (1.0%)
+
+### 효율성 점수: 72/100
+```
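The waste categories and efficiency score in the Korean example (parameter URLs, redirect chains, soft 404s, and a 0-100 score) can be sketched as a small classifier over parsed log rows. The row tuples and the soft-404 flag are illustrative; real soft-404 detection requires comparing response content against the 200 status, and a single request can fall into more than one waste bucket:

```python
# Crawl-waste classification behind the efficiency score.
# Each row is (url, http_status, is_soft_404), a stand-in for a parsed
# access-log entry.

def classify_waste(requests):
    """Count wasted crawl requests per category."""
    waste = {"parameter_urls": 0, "redirects": 0, "soft_404s": 0}
    for url, status, is_soft_404 in requests:
        if "?" in url:
            waste["parameter_urls"] += 1
        if 300 <= status < 400:
            waste["redirects"] += 1
        if is_soft_404:
            waste["soft_404s"] += 1
    return waste

def efficiency(total_requests, waste):
    """Share of crawl requests not spent on waste, as a 0-100 score."""
    return round(100 * (1 - sum(waste.values()) / total_requests))

rows = [
    ("/page", 200, False),
    ("/page?utm=x", 200, False),  # parameter URL
    ("/old", 301, False),         # redirect hop
    ("/gone", 200, True),         # soft 404: 200 status, error content
    ("/blog", 200, False),
]
w = classify_waste(rows)
print(w)                          # {'parameter_urls': 1, 'redirects': 1, 'soft_404s': 1}
print(efficiency(len(rows), w))   # 40
```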
 
 ## Limitations
 
 - Requires actual server access logs (not available via standard web crawling)

@@ -2,7 +2,7 @@
 
 ## Overview
 
-SEO site migration planning and monitoring tool for comprehensive pre-migration risk assessment, redirect mapping, URL inventory, crawl baseline capture, and post-migration traffic/indexation monitoring. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation. Captures full URL inventory via Firecrawl crawl, builds traffic/keyword baselines via Ahrefs, generates redirect maps with per-URL risk scoring, and tracks post-launch recovery with automated alerts.
+SEO site migration planning and monitoring tool for comprehensive pre-migration risk assessment, redirect mapping, URL inventory, crawl baseline capture, and post-migration traffic/indexation monitoring. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation. Captures full URL inventory via Firecrawl crawl, builds traffic/keyword baselines via our-seo-agent CLI, generates redirect maps with per-URL risk scoring, and tracks post-launch recovery with automated alerts.
 
 ## Quick Start
 

@@ -45,7 +45,7 @@ python scripts/migration_planner.py --domain https://blog.example.com --type sub
 
 **Capabilities**:
 - URL inventory via Firecrawl crawl (capture all URLs + status codes)
-- Ahrefs top-pages baseline (traffic, keywords per page)
+- our-seo-agent top-pages baseline (traffic, keywords per page)
 - Redirect map generation (old URL -> new URL mapping)
 - Risk scoring per URL (based on traffic + backlinks + keyword rankings)
 - Pre-migration checklist generation

@@ -78,17 +78,13 @@ python scripts/migration_monitor.py --domain https://new-example.com --migration
 - Recovery timeline estimation
 - Alert generation for traffic drops >20%
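The >20% alert noted above is a threshold check against the pre-migration baseline. A minimal sketch; the traffic numbers and message wording are illustrative:

```python
# Post-migration alert: flag organic traffic drops greater than 20%
# relative to the pre-migration baseline.

def traffic_alert(baseline: float, current: float, threshold: float = 0.20):
    """Return an alert string when traffic fell more than `threshold`, else None."""
    drop = (baseline - current) / baseline
    if drop > threshold:
        return f"ALERT: organic traffic down {drop:.0%} vs pre-migration baseline"
    return None

print(traffic_alert(10000, 7200))  # ALERT: organic traffic down 28% vs pre-migration baseline
print(traffic_alert(10000, 9500))  # None (5% dip is within the threshold)
```

Some post-migration dip is expected while redirects are re-crawled, which is why the threshold is 20% rather than zero.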
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics` | Current organic metrics (traffic, keywords) |
-| `site-explorer-metrics-history` | Historical metrics for pre/post comparison |
-| `site-explorer-top-pages` | Top performing pages for baseline |
-| `site-explorer-pages-by-traffic` | Pages ranked by traffic for risk scoring |
-| `site-explorer-organic-keywords` | Keyword rankings per page |
-| `site-explorer-referring-domains` | Referring domains per page for risk scoring |
-| `site-explorer-backlinks-stats` | Backlink overview for migration impact |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
 
 ## Output Format
 
@@ -8,12 +8,12 @@ description: |
 
 ## Purpose
 
-Comprehensive site migration planning and post-migration monitoring for SEO: crawl-based URL inventory, traffic/keyword baseline capture via Ahrefs, redirect map generation with per-URL risk scoring, pre-migration checklist creation, and post-launch traffic/indexation/ranking recovery tracking with automated alerts. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation.
+Comprehensive site migration planning and post-migration monitoring for SEO: crawl-based URL inventory, traffic/keyword baseline capture via our-seo-agent CLI, redirect map generation with per-URL risk scoring, pre-migration checklist creation, and post-launch traffic/indexation/ranking recovery tracking with automated alerts. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation.
 
 ## Core Capabilities
 
 1. **URL Inventory** - Crawl entire site via Firecrawl to capture all URLs and status codes
-2. **Traffic Baseline** - Capture per-page traffic, keywords, and backlinks via Ahrefs
+2. **Traffic Baseline** - Capture per-page traffic, keywords, and backlinks via our-seo-agent CLI
 3. **Redirect Map Generation** - Create old URL -> new URL mappings with 301 redirect rules
 4. **Risk Scoring** - Score each URL (0-100) based on traffic, backlinks, and keyword rankings
 5. **Pre-Migration Checklist** - Generate type-specific migration checklist (Korean)

@@ -26,15 +26,10 @@ Comprehensive site migration planning and post-migration monitoring for SEO: cra
 
 ## MCP Tool Usage
 
-### Ahrefs for SEO Baseline & Monitoring
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-metrics: Current organic metrics (traffic, keywords)
-mcp__ahrefs__site-explorer-metrics-history: Historical metrics for pre/post comparison
-mcp__ahrefs__site-explorer-top-pages: Top performing pages for baseline
-mcp__ahrefs__site-explorer-pages-by-traffic: Pages ranked by traffic for risk scoring
-mcp__ahrefs__site-explorer-organic-keywords: Keyword rankings per page
-mcp__ahrefs__site-explorer-referring-domains: Referring domains for risk scoring
-mcp__ahrefs__site-explorer-backlinks-stats: Backlink overview for migration impact
+our-seo-agent CLI: Primary SEO baseline data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary migration data
 ```
 
 ### Firecrawl for URL Inventory & Redirect Verification

@@ -58,9 +53,9 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
 ### Pre-Migration Planning
 1. Accept target domain, migration type, and new domain (if applicable)
 2. Crawl URL inventory via Firecrawl (capture all URLs + status codes)
-3. Fetch Ahrefs top pages baseline (traffic, keywords, backlinks per page)
+3. Fetch top pages baseline via our-seo-agent CLI or pre-fetched data
 4. Fetch site-level metrics (total traffic, keywords, referring domains)
-5. Enrich URL inventory with Ahrefs traffic/backlink data
+5. Enrich URL inventory with traffic/backlink data from our-seo-agent CLI
 6. Score risk per URL (0-100) based on traffic weight (40%), backlinks (30%), keywords (30%)
 7. Generate redirect map (old URL -> new URL) based on migration type
 8. Aggregate risk assessment (high/medium/low URL counts, overall risk level)
||||||
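The per-URL risk score in step 6 (traffic 40%, backlinks 30%, keywords 30%) can be sketched as a normalized weighted sum. This is a sketch under the assumption that each metric is normalized against the site-wide maximum; the actual normalization in the skill may differ.

```python
def risk_score(traffic: float, backlinks: float, keywords: float,
               max_traffic: float, max_backlinks: float, max_keywords: float) -> float:
    """Score a URL 0-100: traffic weighted 40%, backlinks 30%, keywords 30%."""
    def norm(value: float, maximum: float) -> float:
        # Normalize to 0-1 against the site-wide maximum; guard against empty sites
        return value / maximum if maximum > 0 else 0.0

    score = (0.40 * norm(traffic, max_traffic)
             + 0.30 * norm(backlinks, max_backlinks)
             + 0.30 * norm(keywords, max_keywords))
    return round(score * 100, 1)
```

A URL at half the site maximum on all three metrics would score 50.0, landing it in a medium-risk band.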
@@ -69,7 +64,7 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
 
 ### Post-Migration Monitoring
 1. Accept domain, migration date, and optional baseline JSON
-2. Compare pre vs post traffic using Ahrefs metrics history
+2. Compare pre vs post traffic using our-seo-agent metrics history
 3. Check redirect health via Firecrawl (broken, chains, loops)
 4. Track indexation changes (pre vs post page count, missing pages)
 5. Track keyword ranking changes for priority keywords
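The pre/post comparison in step 2 reduces to a percent-change calculation against the stored baseline. A minimal sketch (the function name is illustrative, not the skill's API):

```python
def traffic_delta(pre: float, post: float) -> float:
    """Percent change in organic traffic after migration; negative means loss."""
    if pre == 0:
        # No baseline traffic: report no change rather than dividing by zero
        return 0.0
    return round((post - pre) / pre * 100, 1)
```

For example, a drop from 1,000 to 850 monthly visits reports as -15.0%.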
@@ -156,7 +151,7 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
 
 ## Limitations
 
-- Ahrefs data has ~24h freshness lag
+- Data freshness depends on source and collection method
 - Firecrawl crawl limited to 5,000 URLs per run
 - Redirect chain detection depends on Firecrawl following redirects
 - Recovery estimation is heuristic-based on industry averages
@@ -98,12 +98,14 @@ python scripts/executive_report.py --report aggregated_report.json --audience c-
 - Support for C-level, marketing team, and technical team audiences
 - Markdown output format
 
-## Ahrefs MCP Tools Used
+## Data Sources
 
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics` | Fresh current organic metrics snapshot |
-| `site-explorer-metrics-history` | Historical metrics for trend visualization |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` flag to provide pre-fetched JSON |
+| `--output-dir` flag | Scan local JSON files from skills 11-33 |
+| WebSearch / WebFetch | Supplementary data for trend context |
+| Notion MCP | Query past audits from SEO Audit Log database |
 
 ## Output Format
 
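The `--output-dir` row added above implies scanning a directory of per-skill JSON reports. A minimal sketch of that scan, assuming one `*.json` file per skill (the function name is hypothetical):

```python
import json
from pathlib import Path

def scan_skill_outputs(output_dir: str) -> dict[str, dict]:
    """Load every *.json report under output_dir, keyed by file stem."""
    reports: dict[str, dict] = {}
    for path in sorted(Path(output_dir).glob("*.json")):
        try:
            reports[path.stem] = json.loads(path.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            # Skip unreadable or malformed files rather than failing the whole run
            continue
    return reports
```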
@@ -20,8 +20,10 @@ from tenacity import (
     retry_if_exception_type,
 )
 
+# Load environment variables
 load_dotenv()
 
+# Logging setup
 logging.basicConfig(
     level=logging.INFO,
     format="%(asctime)s - %(levelname)s - %(message)s",
@@ -34,6 +36,13 @@ class RateLimiter:
     """Rate limiter using token bucket algorithm."""
 
     def __init__(self, rate: float, per: float = 1.0):
+        """
+        Initialize rate limiter.
+
+        Args:
+            rate: Number of requests allowed
+            per: Time period in seconds (default: 1 second)
+        """
         self.rate = rate
         self.per = per
         self.tokens = rate
@@ -41,6 +50,7 @@ class RateLimiter:
         self._lock = asyncio.Lock()
 
     async def acquire(self) -> None:
+        """Acquire a token, waiting if necessary."""
         async with self._lock:
             now = datetime.now()
             elapsed = (now - self.last_update).total_seconds()
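The token-bucket logic that `RateLimiter.acquire` implements above can be shown end to end in a self-contained sketch. This mirrors the visible fields (`rate`, `per`, `tokens`, `last_update`, `_lock`) but is a simplified illustration, not the module's exact refill arithmetic, which is only partially visible in the hunk.

```python
import asyncio
from datetime import datetime

class TokenBucket:
    """Simplified token-bucket limiter: refill continuously, spend one token per call."""

    def __init__(self, rate: float, per: float = 1.0):
        self.rate = rate          # tokens added per `per` seconds; also bucket capacity
        self.per = per
        self.tokens = rate        # start with a full bucket
        self.last_update = datetime.now()
        self._lock = asyncio.Lock()

    async def acquire(self) -> None:
        """Take one token, sleeping until the refill makes one available."""
        async with self._lock:
            now = datetime.now()
            elapsed = (now - self.last_update).total_seconds()
            # Refill proportionally to elapsed time, capped at capacity
            self.tokens = min(self.rate, self.tokens + elapsed * (self.rate / self.per))
            self.last_update = now
            if self.tokens < 1:
                # Sleep just long enough for the deficit to refill
                await asyncio.sleep((1 - self.tokens) * (self.per / self.rate))
                self.tokens = 0.0
            else:
                self.tokens -= 1
```

Holding the lock across the sleep serializes waiters, which is acceptable at the low request rates this module targets (3 requests/second by default).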
@@ -64,6 +74,14 @@ class BaseAsyncClient:
         requests_per_second: float = 3.0,
         logger: logging.Logger | None = None,
     ):
+        """
+        Initialize base client.
+
+        Args:
+            max_concurrent: Maximum concurrent requests
+            requests_per_second: Rate limit
+            logger: Logger instance
+        """
         self.semaphore = Semaphore(max_concurrent)
         self.rate_limiter = RateLimiter(requests_per_second)
         self.logger = logger or logging.getLogger(self.__class__.__name__)
@@ -83,6 +101,7 @@ class BaseAsyncClient:
         self,
         coro: Callable[[], Any],
     ) -> Any:
+        """Execute a request with rate limiting and retry."""
         async with self.semaphore:
             await self.rate_limiter.acquire()
             self.stats["requests"] += 1
@@ -100,6 +119,7 @@ class BaseAsyncClient:
         requests: list[Callable[[], Any]],
         desc: str = "Processing",
     ) -> list[Any]:
+        """Execute multiple requests concurrently."""
         try:
             from tqdm.asyncio import tqdm
             has_tqdm = True
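The pattern `BaseAsyncClient` uses for batches, a semaphore bounding concurrency plus `asyncio.gather(..., return_exceptions=True)` so one failure does not cancel the rest, can be sketched standalone. `run_batch` is an illustrative reduction, not the class's actual method.

```python
import asyncio

async def run_batch(factories, max_concurrent: int = 5):
    """Run coroutine factories concurrently, bounded by a semaphore.

    Exceptions are returned in the result list instead of raised,
    mirroring gather(..., return_exceptions=True).
    """
    semaphore = asyncio.Semaphore(max_concurrent)

    async def guarded(factory):
        async with semaphore:
            return await factory()

    return await asyncio.gather(*(guarded(f) for f in factories),
                                return_exceptions=True)
```

Callers then inspect each result with `isinstance(result, Exception)` before using it.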
@@ -124,6 +144,7 @@ class BaseAsyncClient:
         return await asyncio.gather(*tasks, return_exceptions=True)
 
     def print_stats(self) -> None:
+        """Print request statistics."""
         self.logger.info("=" * 40)
         self.logger.info("Request Statistics:")
         self.logger.info(f" Total Requests: {self.stats['requests']}")
@@ -140,6 +161,8 @@ class ConfigManager:
 
     @property
     def google_credentials_path(self) -> str | None:
+        """Get Google service account credentials path."""
+        # Prefer SEO-specific credentials, fallback to general credentials
         seo_creds = os.path.expanduser("~/.credential/ourdigital-seo-agent.json")
         if os.path.exists(seo_creds):
             return seo_creds
@@ -147,23 +170,38 @@ class ConfigManager:
 
     @property
     def pagespeed_api_key(self) -> str | None:
+        """Get PageSpeed Insights API key."""
         return os.getenv("PAGESPEED_API_KEY")
 
+    @property
+    def custom_search_api_key(self) -> str | None:
+        """Get Custom Search API key."""
+        return os.getenv("CUSTOM_SEARCH_API_KEY")
+
+    @property
+    def custom_search_engine_id(self) -> str | None:
+        """Get Custom Search Engine ID."""
+        return os.getenv("CUSTOM_SEARCH_ENGINE_ID")
+
     @property
     def notion_token(self) -> str | None:
+        """Get Notion API token."""
         return os.getenv("NOTION_TOKEN") or os.getenv("NOTION_API_KEY")
 
     def validate_google_credentials(self) -> bool:
+        """Validate Google credentials are configured."""
         creds_path = self.google_credentials_path
         if not creds_path:
             return False
         return os.path.exists(creds_path)
 
     def get_required(self, key: str) -> str:
+        """Get required environment variable or raise error."""
         value = os.getenv(key)
         if not value:
             raise ValueError(f"Missing required environment variable: {key}")
         return value
 
 
+# Singleton config instance
 config = ConfigManager()
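The fail-fast pattern behind `ConfigManager.get_required` above can be shown in isolation: return the variable if set, otherwise raise with the variable's name in the message. This is a module-level sketch of the same logic, not the class itself.

```python
import os

def get_required(key: str) -> str:
    """Return the env var's value, or fail fast with a clear error message."""
    value = os.getenv(key)
    if not value:
        raise ValueError(f"Missing required environment variable: {key}")
    return value
```

Raising at startup with the exact variable name makes misconfiguration obvious, instead of surfacing later as an opaque authentication failure.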
@@ -453,11 +453,13 @@ CATEGORY_KOREAN_LABELS: dict[str, str] = {
     "competitor": "경쟁사",
     "schema": "스키마",
     "kpi": "KPI",
-    "search_console": "Search Console",
+    "comprehensive": "종합 감사",
+    "search_console": "서치 콘솔",
     "ecommerce": "이커머스",
     "international": "국제 SEO",
     "ai_search": "AI 검색",
     "entity_seo": "엔티티 SEO",
+    "migration": "사이트 이전",
 }
 
 
@@ -123,11 +123,11 @@ CATEGORY_LABELS_KR = {
     "competitor": "경쟁 분석",
     "schema": "스키마/구조화 데이터",
     "kpi": "KPI 프레임워크",
-    "search_console": "Search Console",
+    "search_console": "서치 콘솔",
     "ecommerce": "이커머스 SEO",
     "international": "국제 SEO",
     "ai_search": "AI 검색 가시성",
-    "entity_seo": "Knowledge Graph",
+    "entity_seo": "지식 그래프",
 }
 
 # Common English issue descriptions -> Korean translations
@@ -434,11 +434,11 @@ class ExecutiveReportGenerator:
         grade_kr = HEALTH_LABELS_KR.get(grade, grade)
         trend_kr = TREND_LABELS_KR.get(summary.health_trend, summary.health_trend)
 
-        lines.append("## Health Score")
+        lines.append("## 종합 건강 점수")
         lines.append("")
         lines.append(f"| 지표 | 값 |")
         lines.append(f"|------|-----|")
-        lines.append(f"| Overall Score | **{summary.health_score}/100** |")
+        lines.append(f"| 종합 점수 | **{summary.health_score}/100** |")
         lines.append(f"| 등급 | {grade_kr} |")
         lines.append(f"| 추세 | {trend_kr} |")
         lines.append("")
@@ -55,7 +55,7 @@ SKILL_REGISTRY = {
     28: {"name": "knowledge-graph", "category": "entity_seo", "weight": 0.10},
     31: {"name": "competitor-intel", "category": "competitor", "weight": 0.15},
     32: {"name": "crawl-budget", "category": "technical", "weight": 0.10},
-    33: {"name": "page-experience", "category": "performance", "weight": 0.10},
+    33: {"name": "migration-planner", "category": "migration", "weight": 0.10},
 }
 
 CATEGORY_WEIGHTS = {
@@ -69,6 +69,13 @@ CATEGORY_WEIGHTS = {
     "competitor": 0.05,
     "schema": 0.05,
     "kpi": 0.05,
+    "comprehensive": 1.0,
+    "search_console": 0.05,
+    "ecommerce": 0.05,
+    "international": 0.05,
+    "ai_search": 0.05,
+    "entity_seo": 0.05,
+    "migration": 0.05,
 }
 
 
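Since `CATEGORY_WEIGHTS` no longer sums to 1.0 after the additions above, a weighted health score only works if the weights of the categories actually present in a run are renormalized. A sketch of that computation, assuming the aggregator averages per-category scores this way (the function is illustrative, not the aggregator's actual method):

```python
def weighted_health_score(category_scores: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Weighted average over the categories present, renormalizing their weights."""
    total_weight = sum(weights.get(c, 0.0) for c in category_scores)
    if total_weight == 0:
        return 0.0
    weighted_sum = sum(score * weights.get(c, 0.0)
                       for c, score in category_scores.items())
    return round(weighted_sum / total_weight, 1)
```

Renormalization keeps the result on the 0-100 scale even when only a subset of skills has run.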
@@ -255,14 +262,15 @@ class ReportAggregator(BaseAsyncClient):
 
         # Extract health score — check top-level first, then nested data dict
         score_found = False
-        for key in ("health_score", "overall_health", "score"):
+        for key in ("health_score", "overall_health", "overall_score", "score",
+                    "technical_score", "efficiency_score", "onpage_score"):
             if key in data:
                 try:
                     skill_output.health_score = float(data[key])
                     score_found = True
-                except (ValueError, TypeError):
-                    pass
-                break
+                    break
+                except (ValueError, TypeError):
+                    continue
 
         if not score_found:
             nested = data.get("data", {})
@@ -276,9 +284,9 @@ class ReportAggregator(BaseAsyncClient):
             if val is not None:
                 try:
                     skill_output.health_score = float(val)
-                except (ValueError, TypeError):
-                    pass
-                break
+                    break
+                except (ValueError, TypeError):
+                    continue
 
         # Extract audit date
         for key in ("audit_date", "report_date", "timestamp", "found_date"):
@@ -20,10 +20,10 @@ Aggregate outputs from all SEO skills (11-33) into stakeholder-ready executive r
 
 ## MCP Tool Usage
 
-### Ahrefs for Fresh Data Pull
+### SEO Data
 ```
-mcp__ahrefs__site-explorer-metrics: Pull current organic metrics snapshot for dashboard
-mcp__ahrefs__site-explorer-metrics-history: Pull historical metrics for trend visualization
+our-seo-agent CLI: Primary data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary live data
 ```
 
 ### Notion for Reading Past Audits and Writing Reports
@@ -42,7 +42,7 @@ mcp__perplexity__*: Enrich reports with industry benchmarks and competitor conte
 ### Dashboard Generation
 1. Accept target domain and optional date range
 2. Query Notion SEO Audit Log for all past audit entries for the domain
-3. Optionally pull fresh metrics from Ahrefs (site-explorer-metrics, metrics-history)
+3. Optionally pull fresh metrics from our-seo-agent CLI or provide pre-fetched JSON via --input
 4. Normalize all skill outputs into unified format
 5. Compute cross-skill health score with weighted category dimensions
 6. Identify top issues (sorted by severity) and top wins across all audits
@@ -25,5 +25,8 @@ markdownify>=0.11.6
 # HTTP client
 httpx>=0.25.0
 
+# Retry logic
+tenacity>=8.2.0
+
 # Data validation
 pydantic>=2.5.0