Fix SEO skill 34 bugs, Korean labels, and transition Ahrefs refs to our-seo-agent (#2)

This commit is contained in:
Andrew Yim
2026-02-14 01:09:35 +09:00
committed by GitHub
parent d2d0a2d460
commit a28bfbf847
34 changed files with 265 additions and 262 deletions

View File

@@ -2,7 +2,7 @@
## Overview
-Keyword strategy and research tool for SEO campaigns. Expands seed keywords via Ahrefs APIs, classifies search intent, clusters topics, performs competitor keyword gap analysis, and supports Korean market keyword discovery including Naver autocomplete.
+Keyword strategy and research tool for SEO campaigns. Expands seed keywords via our-seo-agent CLI or pre-fetched data, classifies search intent, clusters topics, performs competitor keyword gap analysis, and supports Korean market keyword discovery including Naver autocomplete.
## Quick Start
@@ -68,17 +68,13 @@ python scripts/keyword_gap_analyzer.py --target https://example.com --competitor
- Segment gaps by intent type
- Prioritize low-KD high-volume opportunities
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `keywords-explorer-overview` | Get keyword metrics (volume, KD, CPC) |
-| `keywords-explorer-matching-terms` | Find matching keyword variations |
-| `keywords-explorer-related-terms` | Discover semantically related keywords |
-| `keywords-explorer-search-suggestions` | Get autocomplete suggestions |
-| `keywords-explorer-volume-by-country` | Compare volume across countries |
-| `keywords-explorer-volume-history` | Track volume trends over time |
-| `site-explorer-organic-keywords` | Get competitor keyword rankings |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -23,14 +23,10 @@ Expand seed keywords, classify search intent, cluster topics, and identify compe
## MCP Tool Usage
-### Ahrefs for Keyword Data
+### SEO Data
```
-mcp__ahrefs__keywords-explorer-overview: Get keyword metrics
-mcp__ahrefs__keywords-explorer-matching-terms: Find keyword variations
-mcp__ahrefs__keywords-explorer-related-terms: Discover related keywords
-mcp__ahrefs__keywords-explorer-search-suggestions: Autocomplete suggestions
-mcp__ahrefs__keywords-explorer-volume-by-country: Country volume comparison
-mcp__ahrefs__site-explorer-organic-keywords: Competitor keyword rankings
+our-seo-agent CLI: Primary keyword data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Live keyword research and autocomplete data
```
### Web Search for Naver Discovery
@@ -42,7 +38,7 @@ WebSearch: Naver autocomplete and trend discovery
### 1. Seed Keyword Expansion
1. Input seed keyword (Korean or English)
-2. Query Ahrefs matching-terms and related-terms
+2. Query keyword data via our-seo-agent CLI, pre-fetched JSON, or WebSearch
3. Get search suggestions for long-tail variations
4. Apply Korean suffix expansion if Korean market
5. Deduplicate and merge results
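Steps 4-5 of the expansion workflow above can be sketched roughly as follows. This is an illustrative sketch only: the suffix list and the lowercase-dedup rule are assumptions, not part of the skill's actual configuration.

```python
# Illustrative suffixes (assumption): price, review, recommendation, cost.
KOREAN_SUFFIXES = ["가격", "후기", "추천", "비용"]

def expand_korean(seed: str) -> list[str]:
    """Append common Korean search suffixes to a seed keyword (step 4)."""
    return [f"{seed} {suffix}" for suffix in KOREAN_SUFFIXES]

def merge_keywords(*sources: list[str]) -> list[str]:
    """Deduplicate keywords across sources, preserving first-seen order (step 5)."""
    seen: dict[str, None] = {}
    for source in sources:
        for kw in source:
            seen.setdefault(kw.strip().lower(), None)
    return list(seen)

# Hypothetical pre-fetched CLI results merged with suffix expansions.
cli_results = ["임플란트 가격", "임플란트 비용"]
expanded = expand_korean("임플란트")
merged = merge_keywords(cli_results, expanded)
```

Ordered-dict deduplication keeps the primary source's ordering ahead of generated variations, which matters when later steps cap the candidate list.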

View File

@@ -2,7 +2,7 @@
## Overview
-SERP analysis tool for understanding search result landscapes. Detects Google SERP features (featured snippets, PAA, knowledge panels, local pack, video, ads), analyzes Naver SERP composition (blog, cafe, knowledge iN, Smart Store, brand zone, VIEW tab), maps competitor positions, and scores SERP feature opportunities.
+SERP analysis tool for understanding search result landscapes. Detects Google SERP features (featured snippets, PAA, knowledge panels, local pack, video, ads), analyzes Naver SERP composition (blog, cafe, knowledge iN, Smart Store, brand zone, shortform, influencer), maps competitor positions, and scores SERP feature opportunities.
## Quick Start
@@ -65,13 +65,13 @@ python scripts/naver_serp_analyzer.py --keywords-file keywords.txt --json
- Brand zone presence detection
- Shortform/influencer content analysis
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `serp-overview` | Get SERP results for a keyword |
-| `keywords-explorer-overview` | Get keyword metrics and SERP features |
-| `site-explorer-organic-keywords` | Map competitor positions |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Live SERP data and Naver section analysis |
+| Notion MCP | Save analysis report to SEO Audit Log database |
## Output Format

View File

@@ -2,7 +2,7 @@
Naver SERP Analyzer - Naver search result composition analysis
==============================================================
Purpose: Analyze Naver SERP section distribution, content type mapping,
-brand zone detection, and VIEW tab content analysis.
+brand zone detection, and section priority analysis.
Python: 3.10+
Usage:

View File

@@ -21,11 +21,10 @@ Analyze search engine result page composition for Google and Naver. Detect SERP
## MCP Tool Usage
-### Ahrefs for SERP Data
+### SEO Data
```
-mcp__ahrefs__serp-overview: Get SERP results and features for a keyword
-mcp__ahrefs__keywords-explorer-overview: Get keyword metrics, volume, difficulty, and SERP feature flags
-mcp__ahrefs__site-explorer-organic-keywords: Map competitor keyword positions
+our-seo-agent CLI: Primary data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Live SERP data and keyword metrics
```
### Notion for Report Storage
@@ -43,7 +42,7 @@ WebFetch: Fetch Naver SERP HTML for section analysis
## Workflow
### 1. Google SERP Analysis
-1. Fetch SERP data via `mcp__ahrefs__serp-overview` for the target keyword and country
+1. Fetch SERP data via `our-seo-agent` CLI, `--input` JSON, or WebSearch for the target keyword and country
2. Detect SERP features (featured snippet, PAA, local pack, knowledge panel, video, ads, images, shopping)
3. Map competitor positions from organic results (domain, URL, title, position)
4. Classify content type for each result (blog, product, service, news, video)
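Steps 3-4 of the Google SERP workflow can be sketched as below. The input field names (`position`, `url`) are an assumption about the pre-fetched JSON shape, not a documented schema, and the URL-path heuristics for content type are illustrative.

```python
from urllib.parse import urlparse

def classify_content_type(url: str) -> str:
    """Rough content-type guess from URL path (assumed heuristic)."""
    path = urlparse(url).path.lower()
    if "/blog/" in path or "/post/" in path:
        return "blog"
    if "/product" in path or "/shop" in path:
        return "product"
    return "service"

def map_positions(results: list[dict]) -> list[dict]:
    """Map competitor positions from organic results (step 3), classified (step 4)."""
    return [
        {
            "position": r["position"],
            "domain": urlparse(r["url"]).netloc,
            "type": classify_content_type(r["url"]),
        }
        for r in sorted(results, key=lambda r: r["position"])
    ]

# Hypothetical two-result SERP slice.
serp = [
    {"position": 2, "url": "https://a.example.com/blog/implants"},
    {"position": 1, "url": "https://b.example.com/product/implant-kit"},
]
```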
@@ -107,7 +106,7 @@ WebFetch: Fetch Naver SERP HTML for section analysis
## Limitations
-- Ahrefs SERP data may have a delay (not real-time)
+- SERP data may have a delay depending on data source (not real-time)
- Naver SERP HTML structure changes periodically
- Brand zone detection depends on HTML class patterns
- Cannot detect personalized SERP results

View File

@@ -2,7 +2,7 @@
## Overview
-Position tracking tool for monitoring keyword rankings via Ahrefs Rank Tracker. Monitors ranking positions, detects position changes with threshold alerts, calculates visibility scores weighted by search volume, compares against competitors, and segments by brand/non-brand keywords.
+Position tracking tool for monitoring keyword rankings. Monitors ranking positions, detects position changes with threshold alerts, calculates visibility scores weighted by search volume, compares against competitors, and segments by brand/non-brand keywords.
## Quick Start
@@ -41,7 +41,7 @@ python scripts/position_tracker.py --target https://example.com --competitor htt
```
**Capabilities**:
-- Current ranking position retrieval via Ahrefs Rank Tracker
+- Current ranking position retrieval via our-seo-agent CLI or pre-fetched data
- Position change detection with configurable threshold alerts
- Visibility score calculation (weighted by search volume)
- Brand vs non-brand keyword segmentation
@@ -69,17 +69,13 @@ python scripts/ranking_reporter.py --target https://example.com --competitor htt
- Competitor overlap and position comparison
- Average position by keyword group
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `rank-tracker-overview` | Get rank tracking overview for project |
-| `rank-tracker-competitors-overview` | Compare against competitors |
-| `rank-tracker-competitors-pages` | Competitor page-level rankings |
-| `rank-tracker-competitors-stats` | Competitor ranking statistics |
-| `rank-tracker-serp-overview` | SERP details for tracked keywords |
-| `management-projects` | List Ahrefs projects |
-| `management-project-keywords` | Get tracked keywords for project |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -1,7 +1,7 @@
---
name: seo-position-tracking
description: |
-Keyword position tracking and ranking monitoring via Ahrefs Rank Tracker.
+Keyword position tracking for keyword ranking monitoring.
Triggers: rank tracking, position monitoring, keyword rankings, visibility score, ranking report, 키워드 순위, 순위 추적.
---
@@ -9,11 +9,11 @@ description: |
## Purpose
-Monitor keyword ranking positions, detect significant changes, calculate visibility scores, and compare against competitors using Ahrefs Rank Tracker data. Provides actionable alerts for ranking drops and segment-level performance breakdown.
+Monitor keyword ranking positions, detect significant changes, calculate visibility scores, and compare against competitors using our-seo-agent CLI or pre-fetched ranking data. Provides actionable alerts for ranking drops and segment-level performance breakdown.
## Core Capabilities
-1. **Position Monitoring** - Retrieve current keyword ranking positions from Ahrefs Rank Tracker projects
+1. **Position Monitoring** - Retrieve current keyword ranking positions from our-seo-agent CLI or pre-fetched data
2. **Change Detection** - Detect significant position changes with configurable threshold alerts (severity: critical/high/medium/low)
3. **Visibility Scoring** - Calculate weighted visibility scores using CTR-curve model (position 1 = 30%, position 2 = 15%, etc.)
4. **Brand/Non-brand Segmentation** - Automatically classify keywords by brand relevance and search intent type
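The CTR-curve visibility model in capability 3 can be sketched as follows. Only the first two CTR values (position 1 = 30%, position 2 = 15%) come from the text; the rest of the curve and the long-tail floor are assumed for illustration.

```python
# Positions 1-2 per the docs; positions 3-5 and the 1% floor are assumptions.
CTR_CURVE = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings: list[dict]) -> float:
    """Share of total search volume 'captured' by current positions, 0-100.

    Each keyword's volume is weighted by the estimated CTR at its position.
    """
    total = sum(r["volume"] for r in rankings)
    if not total:
        return 0.0
    captured = sum(CTR_CURVE.get(r["position"], 0.01) * r["volume"] for r in rankings)
    return round(100 * captured / total, 1)
```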
@@ -21,15 +21,10 @@ Monitor keyword ranking positions, detect significant changes, calculate visibil
## MCP Tool Usage
-### Ahrefs Rank Tracker Tools
+### SEO Data
```
-mcp__ahrefs__rank-tracker-overview: Get rank tracking overview with current positions
-mcp__ahrefs__rank-tracker-competitors-overview: Compare rankings against competitors
-mcp__ahrefs__rank-tracker-competitors-pages: Competitor page-level ranking data
-mcp__ahrefs__rank-tracker-competitors-stats: Detailed competitor ranking statistics
-mcp__ahrefs__rank-tracker-serp-overview: SERP details for tracked keywords
-mcp__ahrefs__management-projects: List available Ahrefs projects
-mcp__ahrefs__management-project-keywords: Get tracked keywords for a project
+our-seo-agent CLI: Primary ranking data source (future); use --input for pre-fetched JSON
+WebSearch: Supplementary ranking data
```
### Notion for Report Storage
@@ -41,7 +36,7 @@ mcp__notion__notion-update-page: Update existing tracking entries
## Workflow
### Phase 1: Data Collection
-1. Identify Ahrefs project via `management-projects`
+1. Identify tracking project or use --input for pre-fetched data
2. Retrieve tracked keywords via `management-project-keywords`
3. Fetch current positions via `rank-tracker-overview`
4. Fetch competitor data via `rank-tracker-competitors-overview` (if requested)

View File

@@ -68,19 +68,13 @@ python scripts/link_gap_finder.py --target https://example.com --competitor http
- Categorize link sources (editorial, directory, forum, blog, news)
- Prioritize by feasibility and impact
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-all-backlinks` | Get all backlinks for a target |
-| `site-explorer-backlinks-stats` | Backlink statistics overview |
-| `site-explorer-referring-domains` | List referring domains |
-| `site-explorer-anchors` | Anchor text distribution |
-| `site-explorer-broken-backlinks` | Find broken backlinks |
-| `site-explorer-domain-rating` | Get Domain Rating |
-| `site-explorer-domain-rating-history` | DR trend over time |
-| `site-explorer-refdomains-history` | Referring domains trend |
-| `site-explorer-linked-domains` | Domains linked from target |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -22,17 +22,10 @@ Analyze backlink profiles, detect toxic links, find competitor link gaps, track
## MCP Tool Usage
-### Ahrefs for Backlink Data
+### SEO Data
```
-mcp__ahrefs__site-explorer-all-backlinks: Get all backlinks for a target
-mcp__ahrefs__site-explorer-backlinks-stats: Backlink statistics overview
-mcp__ahrefs__site-explorer-referring-domains: List referring domains
-mcp__ahrefs__site-explorer-anchors: Anchor text distribution
-mcp__ahrefs__site-explorer-broken-backlinks: Find broken backlinks
-mcp__ahrefs__site-explorer-domain-rating: Get Domain Rating
-mcp__ahrefs__site-explorer-domain-rating-history: DR trend over time
-mcp__ahrefs__site-explorer-refdomains-history: Referring domains trend
-mcp__ahrefs__site-explorer-linked-domains: Domains linked from target
+our-seo-agent CLI: Primary backlink data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary backlink data
```
### Notion for Report Storage

View File

@@ -2,7 +2,7 @@
## Overview
-Content strategy tool for SEO-driven content planning. Performs content inventory via sitemap crawl and Ahrefs top pages, scores content performance, detects content decay, analyzes topic gaps vs competitors, maps topic clusters, and generates content briefs. Supports Korean content patterns (Naver Blog format, review/후기 content).
+Content strategy tool for SEO-driven content planning. Performs content inventory via sitemap crawl and our-seo-agent CLI, scores content performance, detects content decay, analyzes topic gaps vs competitors, maps topic clusters, and generates content briefs. Supports Korean content patterns (Naver Blog format, review/후기 content).
## Quick Start
@@ -42,7 +42,7 @@ python scripts/content_auditor.py --url https://example.com --type blog --json
```
**Capabilities**:
-- Content inventory via sitemap crawl + Ahrefs top-pages
+- Content inventory via sitemap crawl + our-seo-agent CLI or pre-fetched data
- Performance scoring (traffic, rankings, backlinks)
- Content decay detection (pages losing traffic over time)
- Content type classification (blog, product, service, landing, resource)
@@ -85,15 +85,13 @@ python scripts/content_brief_generator.py --keyword "dental implant cost" --url
- Internal linking suggestions
- Korean content format recommendations
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-top-pages` | Get top performing pages |
-| `site-explorer-pages-by-traffic` | Pages ranked by organic traffic |
-| `site-explorer-organic-keywords` | Keywords per page |
-| `site-explorer-organic-competitors` | Find content competitors |
-| `site-explorer-best-by-external-links` | Best content by links |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -20,16 +20,10 @@ Audit existing content performance, identify topic gaps vs competitors, map topi
## MCP Tool Usage
-### Ahrefs for Content Data
+### SEO Data
```
-site-explorer-top-pages: Get top performing pages
-site-explorer-pages-by-traffic: Pages ranked by organic traffic
-site-explorer-organic-keywords: Keywords per page
-site-explorer-organic-competitors: Find content competitors
-site-explorer-best-by-external-links: Best content by backlinks
-keywords-explorer-matching-terms: Secondary keyword suggestions
-keywords-explorer-related-terms: LSI keyword suggestions
-serp-overview: Analyze top ranking results for a keyword
+our-seo-agent CLI: Primary content/traffic data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary content data
```
### WebSearch for Content Research
@@ -47,7 +41,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
### 1. Content Audit
1. Crawl sitemap to discover all content URLs
-2. Fetch top pages data from Ahrefs (traffic, keywords, backlinks)
+2. Fetch top pages data via our-seo-agent CLI, pre-fetched JSON, or WebSearch
3. Classify content types (blog, product, service, landing, resource)
4. Score each page performance (0-100 composite)
5. Detect decaying content (traffic decline patterns)
@@ -56,7 +50,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
8. Generate recommendations
### 2. Content Gap Analysis
-1. Gather target site keywords from Ahrefs
+1. Gather target site keywords via our-seo-agent CLI or pre-fetched data
2. Gather competitor top pages and keywords
3. Identify topics present in competitors but missing from target
4. Score gaps by priority (traffic potential + competition coverage)
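Step 4's priority score ("traffic potential + competition coverage") can be sketched as below. The 50/50 weighting between the two factors is an assumption, not the skill's documented formula.

```python
def gap_priority(traffic_potential: int,
                 competitors_covering: int,
                 total_competitors: int) -> float:
    """Score a topic gap: higher when it drives traffic and most competitors cover it.

    The 0.5/0.5 blend of base potential and coverage is an illustrative assumption.
    """
    coverage = competitors_covering / total_competitors if total_competitors else 0.0
    return round(traffic_potential * (0.5 + 0.5 * coverage), 1)
```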
@@ -123,7 +117,7 @@ notion-create-pages: Save audit reports to SEO Audit Log
## Limitations
-- Ahrefs API required for traffic and keyword data
+- our-seo-agent CLI or pre-fetched JSON required for traffic and keyword data
- Competitor analysis limited to publicly available content
- Content decay detection uses heuristic without historical data in standalone mode
- Topic clustering requires minimum 3 topics per cluster

View File

@@ -68,12 +68,13 @@ python scripts/product_schema_checker.py --sitemap https://example.com/product-s
- Merchant listing schema support
- Korean market: Naver Shopping structured data requirements
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-pages-by-traffic` | Identify top product/category pages |
-| `site-explorer-organic-keywords` | Product page keyword performance |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -23,10 +23,10 @@ Audit e-commerce sites for product page optimization, structured data validation
## MCP Tool Usage
-### Ahrefs for Product Page Discovery
+### SEO Data
```
-mcp__ahrefs__site-explorer-pages-by-traffic: Identify top product and category pages
-mcp__ahrefs__site-explorer-organic-keywords: Product page keyword performance
+our-seo-agent CLI: Primary product page data source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary product page data
```
### WebSearch for Marketplace Checks
@@ -43,7 +43,7 @@ mcp__notion__notion-create-pages: Save audit report to SEO Audit Log database
## Workflow
### 1. Product Page Audit
-1. Discover product pages via Ahrefs pages-by-traffic or sitemap
+1. Discover product pages via our-seo-agent CLI, pre-fetched JSON, or sitemap crawl
2. For each product page check:
- Title tag: contains product name, under 60 chars
- Meta description: includes price/feature info, under 155 chars
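The per-page checks in step 2 reduce to simple predicates; a minimal sketch using the stated limits (60-char title, 155-char meta description) follows. The substring test for the product name is an assumed implementation detail.

```python
def check_product_page(title: str, meta: str, product_name: str) -> dict[str, bool]:
    """Run the step-2 checks: product name in title, title/meta length limits."""
    return {
        "title_has_product": product_name.lower() in title.lower(),
        "title_length_ok": len(title) < 60,   # "under 60 chars"
        "meta_length_ok": len(meta) < 155,    # "under 155 chars"
    }
```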

View File

@@ -2,7 +2,7 @@
## Overview
-SEO KPI and performance framework for unified metrics aggregation across all SEO dimensions. Establishes baselines, sets targets (30/60/90-day), generates executive summaries with health scores, provides tactical breakdowns, estimates ROI using Ahrefs traffic cost, and supports period-over-period comparison (MoM, QoQ).
+SEO KPI and performance framework for unified metrics aggregation across all SEO dimensions. Establishes baselines, sets targets (30/60/90-day), generates executive summaries with health scores, provides tactical breakdowns, estimates ROI using our-seo-agent traffic cost data, and supports period-over-period comparison (MoM, QoQ).
## Quick Start
@@ -49,10 +49,10 @@ python scripts/kpi_aggregator.py --url https://example.com --roi --json
- Content KPIs (indexed pages, content freshness score, thin content ratio)
- Link KPIs (domain rating, referring domains, link velocity)
- Local KPIs (GBP visibility, review score, citation accuracy)
-- Multi-source data aggregation from Ahrefs and other skill outputs
+- Multi-source data aggregation from our-seo-agent CLI and other skill outputs
- Baseline establishment and target setting (30/60/90 day)
- Overall health score (0-100) with weighted dimensions
-- ROI estimation using Ahrefs organic traffic cost
+- ROI estimation using organic traffic cost data
## Performance Reporter
@@ -79,15 +79,13 @@ python scripts/performance_reporter.py --url https://example.com --period monthl
- Target vs actual comparison with progress %
- Traffic value change (ROI proxy)
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics` | Current organic metrics |
-| `site-explorer-metrics-history` | Historical metrics trends |
-| `site-explorer-metrics-by-country` | Country-level breakdown |
-| `site-explorer-domain-rating-history` | DR trend over time |
-| `site-explorer-total-search-volume-history` | Total keyword volume trend |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -10,26 +10,23 @@ description: |
## Purpose
-Aggregate SEO KPIs across all dimensions into a unified dashboard. Establish baselines, set targets (30/60/90-day), generate executive summaries with health scores, provide tactical breakdowns, estimate ROI using Ahrefs traffic cost, and support period-over-period comparison (MoM, QoQ, YoY).
+Aggregate SEO KPIs across all dimensions into a unified dashboard. Establish baselines, set targets (30/60/90-day), generate executive summaries with health scores, provide tactical breakdowns, estimate ROI using our-seo-agent traffic cost data, and support period-over-period comparison (MoM, QoQ, YoY).
## Core Capabilities
1. **KPI Aggregation** - Unified metrics across 7 dimensions (traffic, rankings, links, technical, content, engagement, local)
2. **Health Scoring** - Weighted 0-100 score with trend direction
3. **Baseline & Targets** - Establish baselines and set 30/60/90 day growth targets
-4. **ROI Estimation** - Traffic value from Ahrefs organic cost
+4. **ROI Estimation** - Traffic value from organic cost data
5. **Performance Reporting** - Period-over-period comparison with executive summary
6. **Tactical Breakdown** - Actionable next steps per dimension
## MCP Tool Usage
-### Ahrefs for SEO Metrics
+### SEO Data
```
-mcp__ahrefs__site-explorer-metrics: Current organic metrics snapshot
-mcp__ahrefs__site-explorer-metrics-history: Historical trend data
-mcp__ahrefs__site-explorer-metrics-by-country: Country-level breakdown
-mcp__ahrefs__site-explorer-domain-rating-history: Domain rating trend
-mcp__ahrefs__site-explorer-total-search-volume-history: Keyword volume trend
+our-seo-agent CLI: Primary metrics source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary metrics data
```
### Notion for Report Storage
@@ -45,7 +42,7 @@ mcp__notion__*: Save reports to SEO Audit Log database
3. Calculate dimension scores with weights (traffic 25%, rankings 20%, technical 20%, content 15%, links 15%, local 5%)
4. Compute overall health score (0-100)
5. Set 30/60/90 day targets (5%/10%/20% improvement)
-6. Estimate ROI from Ahrefs traffic cost (divide raw cost by 100 for USD)
+6. Estimate ROI from traffic cost data (use our-seo-agent CLI or pre-fetched JSON)
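The aggregation rules in steps 3-6 — the stated dimension weights, the 5%/10%/20% targets, and the divide-raw-cost-by-100 USD conversion — can be sketched as:

```python
# Weights as stated in step 3 of the workflow.
WEIGHTS = {"traffic": 0.25, "rankings": 0.20, "technical": 0.20,
           "content": 0.15, "links": 0.15, "local": 0.05}

def health_score(dimension_scores: dict[str, float]) -> float:
    """Weighted 0-100 health score across the six dimensions (step 4)."""
    return round(sum(WEIGHTS[d] * s for d, s in dimension_scores.items()), 1)

def targets(baseline: float) -> dict[str, float]:
    """30/60/90-day targets at 5%/10%/20% improvement (step 5)."""
    return {"30d": baseline * 1.05, "60d": baseline * 1.10, "90d": baseline * 1.20}

def roi_usd(raw_traffic_cost: int) -> float:
    """Raw traffic cost divided by 100 yields USD, per the workflow note (step 6)."""
    return raw_traffic_cost / 100
```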
### 2. Performance Reporting
1. Determine date range from period (monthly/quarterly/yearly/custom)
@@ -94,10 +91,10 @@ mcp__notion__*: Save reports to SEO Audit Log database
## Limitations
-- Local KPIs require external GBP data (not available via Ahrefs)
+- Local KPIs require external GBP data (not available via our-seo-agent)
- Engagement KPIs (bounce rate, session duration) require Google Analytics
- Technical health is estimated heuristically from available data
-- ROI is estimated from Ahrefs traffic cost, not actual revenue
+- ROI is estimated from organic traffic cost data, not actual revenue
## Notion Output (Required)

View File

@@ -83,12 +83,13 @@ python scripts/international_auditor.py --url https://example.com --korean-expan
- CJK-specific URL encoding issues
- Regional search engine considerations (Naver, Baidu, Yahoo Japan)
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `site-explorer-metrics-by-country` | Country-level traffic distribution |
-| `site-explorer-organic-keywords` | Keywords by country filter |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -21,10 +21,10 @@ Audit international SEO implementation: hreflang tags, URL structure patterns, c
## MCP Tool Usage
-### Ahrefs for Country Metrics
+### SEO Data
```
-mcp__ahrefs__site-explorer-metrics-by-country: Country-level traffic distribution
-mcp__ahrefs__site-explorer-organic-keywords: Keywords filtered by country
+our-seo-agent CLI: Primary country metrics source (future); use --input for pre-fetched JSON
+WebSearch / WebFetch: Supplementary international data
```
### Notion for Report Storage
@@ -69,7 +69,7 @@ WebSearch: Research hreflang implementation guides and regional search engine re
4. Recommend proper implementation (suggest, do not force)
### 5. Korean Expansion Analysis (Optional)
-1. Analyze current traffic by country via Ahrefs
+1. Analyze current traffic by country via our-seo-agent CLI or pre-fetched data
2. Recommend priority target markets for Korean businesses
3. Check CJK-specific URL encoding issues
4. Advise on regional search engines (Naver, Baidu, Yahoo Japan)
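Step 1 of the Korean expansion analysis amounts to ranking markets by their share of current organic traffic; a minimal sketch follows. The input shape (country code mapped to monthly organic traffic) is an assumed pre-fetched JSON structure.

```python
def market_shares(traffic_by_country: dict[str, int]) -> list[tuple[str, float]]:
    """Rank markets by percentage share of total organic traffic, descending."""
    total = sum(traffic_by_country.values())
    if not total:
        return []
    shares = [(country, round(100 * t / total, 1))
              for country, t in traffic_by_country.items()]
    return sorted(shares, key=lambda pair: pair[1], reverse=True)
```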

View File

@@ -2,7 +2,7 @@
## Overview
-AI search visibility and brand radar tool for tracking how a brand appears in AI-generated search answers. Monitors AI answer citations, tracks share of voice in AI search vs competitors, analyzes cited domains and pages, and tracks impressions/mentions history. Uses Ahrefs Brand Radar APIs for comprehensive AI visibility monitoring.
+AI search visibility and brand radar tool for tracking how a brand appears in AI-generated search answers. Monitors AI answer citations, tracks share of voice in AI search vs competitors, analyzes cited domains and pages, and tracks impressions/mentions history. Uses our-seo-agent CLI or pre-fetched data for comprehensive AI visibility monitoring.
## Quick Start
@@ -74,19 +74,13 @@ python scripts/ai_citation_analyzer.py --target example.com --responses --json
- Competitor citation comparison
- Recommendation generation for improving AI visibility
-## Ahrefs MCP Tools Used
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `brand-radar-ai-responses` | Get AI-generated responses mentioning brand |
-| `brand-radar-cited-domains` | Domains cited in AI answers |
-| `brand-radar-cited-pages` | Specific pages cited in AI answers |
-| `brand-radar-impressions-history` | Brand impression trend over time |
-| `brand-radar-impressions-overview` | Current impression metrics |
-| `brand-radar-mentions-history` | Brand mention trend over time |
-| `brand-radar-mentions-overview` | Current mention metrics |
-| `brand-radar-sov-history` | Share of voice trend |
-| `brand-radar-sov-overview` | Current share of voice |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary live data |
+| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -2,14 +2,14 @@
name: seo-ai-visibility
description: |
AI search visibility and brand radar monitoring. Tracks how a brand appears
-in AI-generated search answers using Ahrefs Brand Radar APIs.
+in AI-generated search answers using our-seo-agent CLI or pre-fetched data.
Triggers: AI search, AI visibility, brand radar, AI citations,
share of voice, AI answers, AI mentions.
---
# SEO AI Visibility & Brand Radar
-Monitor and analyze brand visibility in AI-generated search results. This skill uses Ahrefs Brand Radar APIs to track impressions, mentions, share of voice, cited domains, cited pages, and AI response content.
+Monitor and analyze brand visibility in AI-generated search results. This skill uses our-seo-agent CLI or pre-fetched data to track impressions, mentions, share of voice, cited domains, cited pages, and AI response content.
## Capabilities
@@ -30,24 +30,18 @@ Monitor and analyze brand visibility in AI-generated search results. This skill
## Workflow
1. **Input**: User provides target domain and optional competitors
-2. **Data Collection**: Fetch metrics from Ahrefs Brand Radar APIs
+2. **Data Collection**: Fetch metrics from our-seo-agent CLI or pre-fetched JSON
3. **Analysis**: Calculate trends, compare competitors, analyze sentiment
4. **Recommendations**: Generate actionable Korean-language recommendations
5. **Output**: JSON report and Notion database entry
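The competitor comparison in step 3 centers on share of voice; a minimal sketch follows, computing each brand's mentions as a percentage of all mentions in the tracked set (a relative metric, as the Limitations section notes). The mention-count input shape is an assumption.

```python
def share_of_voice(mentions: dict[str, int]) -> dict[str, float]:
    """Percentage share of AI-answer mentions per brand within the tracked set."""
    total = sum(mentions.values())
    if not total:
        return {}
    return {brand: round(100 * n / total, 1) for brand, n in mentions.items()}
```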
-## Ahrefs MCP Tools
+## Data Sources
-| Tool | Purpose |
-|------|---------|
-| `brand-radar-ai-responses` | AI-generated responses mentioning brand |
-| `brand-radar-cited-domains` | Domains cited in AI answers |
-| `brand-radar-cited-pages` | Specific pages cited in AI answers |
-| `brand-radar-impressions-history` | Impression trend over time |
-| `brand-radar-impressions-overview` | Current impression metrics |
-| `brand-radar-mentions-history` | Mention trend over time |
-| `brand-radar-mentions-overview` | Current mention metrics |
-| `brand-radar-sov-history` | Share of voice trend |
-| `brand-radar-sov-overview` | Current share of voice |
+| Source | Purpose |
+|--------|---------|
+| `our-seo-agent` CLI | Primary AI visibility data source (future); use `--input` for pre-fetched JSON |
+| WebSearch / WebFetch | Supplementary AI search data |
+| Notion MCP | Save audit report to database |
## Notion Output
@@ -100,7 +94,7 @@ All reports are saved to the OurDigital SEO Audit Log:
## Limitations
-- Requires Ahrefs Brand Radar API access (not available in basic plans)
+- Requires our-seo-agent CLI or pre-fetched AI visibility data
- AI search landscape changes rapidly; data may not reflect real-time state
- Share of Voice metrics are relative to tracked competitor set only
- Sentiment analysis based on AI-generated text, not user perception

View File

@@ -2,7 +2,7 @@
## Overview
-Knowledge Graph and Entity SEO tool for analyzing brand entity presence in Google Knowledge Graph, Knowledge Panels, People Also Ask (PAA), and FAQ rich results. Checks entity attribute completeness, Wikipedia/Wikidata presence, and Korean equivalents (Naver knowledge iN, Naver encyclopedia). Uses WebSearch and WebFetch for data collection, Ahrefs serp-overview for SERP feature detection.
+Knowledge Graph and Entity SEO tool for analyzing brand entity presence in Google Knowledge Graph, Knowledge Panels, People Also Ask (PAA), and FAQ rich results. Checks entity attribute completeness, Wikipedia/Wikidata presence, and Korean equivalents (Naver knowledge iN, Naver encyclopedia). Uses WebSearch and WebFetch for data collection, WebSearch for SERP feature detection.
## Quick Start
@@ -75,7 +75,7 @@ python scripts/entity_auditor.py --url https://example.com --entity "Brand Name"
|--------|---------|
| WebSearch | Search for entity/brand to detect Knowledge Panel |
| WebFetch | Fetch Wikipedia, Wikidata, Naver pages |
-| Ahrefs `serp-overview` | SERP feature detection for entity keywords |
+| WebSearch | SERP feature detection for entity keywords |
## Output Format

View File

@@ -48,7 +48,7 @@ Analyze brand entity presence in Google Knowledge Graph, Knowledge Panels, Peopl
3. Check sameAs links accessibility
4. Use **WebSearch** to search brand name and analyze SERP features
5. Monitor PAA questions for brand keywords
-6. Use **Ahrefs serp-overview** for SERP feature detection
+6. Use **WebSearch** for SERP feature detection
7. Save report to **Notion** SEO Audit Log
## Tools Used
@@ -57,7 +57,7 @@ Analyze brand entity presence in Google Knowledge Graph, Knowledge Panels, Peopl
|------|---------|
| WebSearch | Search for entity/brand to detect Knowledge Panel |
| WebFetch | Fetch Wikipedia, Wikidata, Naver pages, website schemas |
| Ahrefs `serp-overview` | SERP feature detection for entity keywords |
| WebSearch | SERP feature detection for entity keywords |
| Notion | Save audit reports to SEO Audit Log database |
## Notion Output

View File

@@ -2,7 +2,7 @@
## Overview
Competitor intelligence and benchmarking tool for comprehensive SEO competitive analysis. Auto-discovers competitors via Ahrefs, builds competitor profile cards (DR, traffic, keywords, backlinks, content volume), creates head-to-head comparison matrices, tracks traffic trends, analyzes keyword overlap, compares content freshness/volume, and scores competitive threats. Supports Korean market competitor analysis including Naver Blog/Cafe presence.
Competitor intelligence and benchmarking tool for comprehensive SEO competitive analysis. Auto-discovers competitors via our-seo-agent CLI, builds competitor profile cards (DR, traffic, keywords, backlinks, content volume), creates head-to-head comparison matrices, tracks traffic trends, analyzes keyword overlap, compares content freshness/volume, and scores competitive threats. Supports Korean market competitor analysis including Naver Blog/Cafe presence.
## Quick Start
@@ -41,7 +41,7 @@ python scripts/competitor_profiler.py --target https://example.com --korean-mark
```
**Capabilities**:
- Competitor auto-discovery via Ahrefs organic-competitors
- Competitor auto-discovery via our-seo-agent CLI or pre-fetched data
- Competitor profile cards:
- Domain Rating (DR)
- Organic traffic estimate
@@ -78,21 +78,13 @@ python scripts/competitive_monitor.py --target https://example.com --scope traff
- Alert generation for significant competitive movements
- Market share estimation based on organic traffic
## Ahrefs MCP Tools Used
## Data Sources
| Tool | Purpose |
|------|---------|
| `site-explorer-metrics` | Current organic metrics |
| `site-explorer-metrics-history` | Historical metrics trends |
| `site-explorer-organic-competitors` | Discover organic competitors |
| `site-explorer-organic-keywords` | Keyword rankings per domain |
| `site-explorer-top-pages` | Top performing pages |
| `site-explorer-pages-by-traffic` | Pages ranked by traffic |
| `site-explorer-domain-rating` | Domain Rating |
| `site-explorer-domain-rating-history` | DR trend |
| `site-explorer-referring-domains` | Referring domain list |
| `site-explorer-backlinks-stats` | Backlink overview |
| `site-explorer-pages-history` | Page index history |
| Source | Purpose |
|--------|---------|
| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
| WebSearch / WebFetch | Supplementary live data |
| Notion MCP | Save audit report to database |
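The `--input` pattern in the table above recurs across these skills; a minimal sketch of how a script might consume pre-fetched JSON in place of live `our-seo-agent` calls (the file shape and field names here are illustrative assumptions, not the actual export format):

```python
import json
import tempfile


def load_prefetched(path: str) -> dict:
    """Load pre-fetched SEO data exported as JSON (shape is an assumption)."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)


# Simulate a pre-fetched export for demonstration purposes
with tempfile.NamedTemporaryFile(
    "w", suffix=".json", delete=False, encoding="utf-8"
) as f:
    json.dump({"competitors": [{"domain": "rival.com", "dr": 71}]}, f)
    path = f.name

data = load_prefetched(path)
print(len(data.get("competitors", [])))  # 1
```

Scripts can then fall back to this loader whenever `--input` is given, and reserve live CLI calls for when it is absent.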
## Output Format

View File

@@ -15,7 +15,7 @@ Comprehensive competitor intelligence for SEO: auto-discover competitors, build
## Core Capabilities
1. **Competitor Discovery** - Auto-discover organic competitors via Ahrefs
1. **Competitor Discovery** - Auto-discover organic competitors via our-seo-agent CLI
2. **Profile Cards** - DR, traffic, keywords, referring domains, top pages, content volume
3. **Comparison Matrix** - Multi-dimensional head-to-head comparison
4. **Keyword Overlap** - Shared, unique, and gap keyword analysis
@@ -26,19 +26,10 @@ Comprehensive competitor intelligence for SEO: auto-discover competitors, build
## MCP Tool Usage
### Ahrefs for Competitive Data
### SEO Data
```
mcp__ahrefs__site-explorer-organic-competitors: Discover organic competitors
mcp__ahrefs__site-explorer-metrics: Current organic metrics
mcp__ahrefs__site-explorer-metrics-history: Historical metric trends
mcp__ahrefs__site-explorer-domain-rating: Domain Rating score
mcp__ahrefs__site-explorer-domain-rating-history: DR trend over time
mcp__ahrefs__site-explorer-organic-keywords: Keyword rankings per domain
mcp__ahrefs__site-explorer-top-pages: Top performing pages
mcp__ahrefs__site-explorer-pages-by-traffic: Pages ranked by traffic
mcp__ahrefs__site-explorer-referring-domains: Referring domain list
mcp__ahrefs__site-explorer-backlinks-stats: Backlink overview
mcp__ahrefs__site-explorer-pages-history: Page index history
our-seo-agent CLI: Primary competitive data source (future); use --input for pre-fetched JSON
WebSearch / WebFetch: Supplementary competitor data
```
### Notion for Report Storage
@@ -55,7 +46,7 @@ WebSearch: Check Naver Blog/Cafe presence for competitors
### Competitor Profiling
1. Accept target URL/domain
2. Auto-discover competitors via Ahrefs organic-competitors (or use provided list)
2. Auto-discover competitors via our-seo-agent CLI or use provided list
3. Build profile card for target and each competitor (DR, traffic, keywords, backlinks, content)
4. Analyze keyword overlap between target and each competitor
5. Build multi-dimensional comparison matrix
@@ -136,9 +127,9 @@ WebSearch: Check Naver Blog/Cafe presence for competitors
## Limitations
- Ahrefs data has ~24h freshness lag
- Data freshness depends on source and collection method
- Keyword overlap limited to top 1,000 keywords per domain
- Content velocity based on Ahrefs page index (not real-time crawl)
- Content velocity based on page index data (not real-time crawl)
- Naver presence detection is heuristic-based
## Notion Output (Required)

View File

@@ -78,8 +78,8 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --sitemap https://
# Per-bot profiling
python scripts/crawl_budget_analyzer.py --log-file access.log --scope bots --json
# With Ahrefs page history comparison
python scripts/crawl_budget_analyzer.py --log-file access.log --url https://example.com --ahrefs --json
# With external page history comparison
python scripts/crawl_budget_analyzer.py --log-file access.log --url https://example.com --input pages.json --json
```
**Capabilities**:
@@ -112,7 +112,7 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --url https://exam
|--------|---------|
| Server access logs | Primary crawl data |
| XML sitemap | Reference for expected crawlable pages |
| Ahrefs `site-explorer-pages-history` | Compare indexed pages with crawled pages |
| `our-seo-agent` CLI | Compare indexed pages with crawled pages (future) |
## Output Format

View File

@@ -62,8 +62,8 @@ python scripts/crawl_budget_analyzer.py --log-file access.log --scope orphans --
python scripts/crawl_budget_analyzer.py --log-file access.log --scope bots --json
```
### Step 4: Cross-Reference with Ahrefs (Optional)
Use `site-explorer-pages-history` to compare indexed pages vs crawled pages.
### Step 4: Cross-Reference with External Data (Optional)
Use `our-seo-agent` CLI or provide pre-fetched JSON via `--input` to compare indexed pages vs crawled pages. WebSearch can supplement with current indexing data.
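The indexed-vs-crawled comparison in this step is essentially two set differences; a minimal sketch with illustrative URL sets (the real inputs would come from the pre-fetched JSON and the parsed access log):

```python
# Indexed pages (from pre-fetched data) vs URLs actually hit by bots (from logs)
indexed = {"/", "/pricing", "/blog/a", "/blog/b"}
crawled = {"/", "/pricing", "/blog/a", "/old-page"}

never_crawled = indexed - crawled   # indexed but not visited by bots in the period
wasted_crawl = crawled - indexed    # crawl budget spent on non-indexed URLs

print(sorted(never_crawled))  # ['/blog/b']
print(sorted(wasted_crawl))   # ['/old-page']
```

`never_crawled` flags pages at risk of going stale in the index, while `wasted_crawl` feeds the crawl-waste recommendations in the next step.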
### Step 5: Generate Recommendations
Prioritized action items:
@@ -76,12 +76,21 @@ Prioritized action items:
### Step 6: Report to Notion
Save Korean-language report to SEO Audit Log database.
## MCP Tools Used
| Property | Type | Description |
|----------|------|-------------|
| Issue | Title | Report title (Korean + date) |
| Site | URL | Audited website URL |
| Category | Select | Crawl Budget |
| Priority | Select | Based on efficiency score |
| Found Date | Date | Analysis date (YYYY-MM-DD) |
| Audit ID | Rich Text | Format: CRAWL-YYYYMMDD-NNN |
| Tool | Purpose |
|------|---------|
| Ahrefs `site-explorer-pages-history` | Compare indexed pages with crawled pages |
| Notion | Save audit report to database |
## Data Sources
| Source | Purpose |
|--------|---------|
| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
| Notion MCP | Save audit report to database |
| WebSearch | Current bot documentation and best practices |
## Output Format
@@ -118,6 +127,27 @@ Save Korean-language report to SEO Audit Log database.
}
```
## Korean Output Example
```
# 크롤 예산 분석 보고서 - example.com
## 분석 기간: 2025-01-01 ~ 2025-01-31
### 봇별 크롤 현황
| 봇 | 요청 수 | 고유 URL | 일 평균 |
|----|---------|---------|---------|
| Googlebot | 80,000 | 12,000 | 2,580 |
| Yeti (Naver) | 35,000 | 8,000 | 1,129 |
### 크롤 낭비 요인
- 파라미터 URL: 5,000건 (3.3%)
- 리다이렉트 체인: 2,000건 (1.3%)
- 소프트 404: 1,500건 (1.0%)
### 효율성 점수: 72/100
```
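The waste percentages in the sample report above can be reproduced with simple ratios; a sketch assuming a 150,000-request log total (an assumption chosen because it matches the figures shown):

```python
# Crawl-waste share per category, as a fraction of all logged bot requests
total_requests = 150_000
waste = {"파라미터 URL": 5_000, "리다이렉트 체인": 2_000, "소프트 404": 1_500}

for label, count in waste.items():
    print(f"{label}: {count:,}건 ({count / total_requests:.1%})")
# 파라미터 URL: 5,000건 (3.3%)
# 리다이렉트 체인: 2,000건 (1.3%)
# 소프트 404: 1,500건 (1.0%)
```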
## Limitations
- Requires actual server access logs (not available via standard web crawling)

View File

@@ -2,7 +2,7 @@
## Overview
SEO site migration planning and monitoring tool for comprehensive pre-migration risk assessment, redirect mapping, URL inventory, crawl baseline capture, and post-migration traffic/indexation monitoring. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation. Captures full URL inventory via Firecrawl crawl, builds traffic/keyword baselines via Ahrefs, generates redirect maps with per-URL risk scoring, and tracks post-launch recovery with automated alerts.
SEO site migration planning and monitoring tool for comprehensive pre-migration risk assessment, redirect mapping, URL inventory, crawl baseline capture, and post-migration traffic/indexation monitoring. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation. Captures full URL inventory via Firecrawl crawl, builds traffic/keyword baselines via our-seo-agent CLI, generates redirect maps with per-URL risk scoring, and tracks post-launch recovery with automated alerts.
## Quick Start
@@ -45,7 +45,7 @@ python scripts/migration_planner.py --domain https://blog.example.com --type sub
**Capabilities**:
- URL inventory via Firecrawl crawl (capture all URLs + status codes)
- Ahrefs top-pages baseline (traffic, keywords per page)
- our-seo-agent top-pages baseline (traffic, keywords per page)
- Redirect map generation (old URL -> new URL mapping)
- Risk scoring per URL (based on traffic + backlinks + keyword rankings)
- Pre-migration checklist generation
@@ -78,17 +78,13 @@ python scripts/migration_monitor.py --domain https://new-example.com --migration
- Recovery timeline estimation
- Alert generation for traffic drops >20%
## Ahrefs MCP Tools Used
## Data Sources
| Tool | Purpose |
|------|---------|
| `site-explorer-metrics` | Current organic metrics (traffic, keywords) |
| `site-explorer-metrics-history` | Historical metrics for pre/post comparison |
| `site-explorer-top-pages` | Top performing pages for baseline |
| `site-explorer-pages-by-traffic` | Pages ranked by traffic for risk scoring |
| `site-explorer-organic-keywords` | Keyword rankings per page |
| `site-explorer-referring-domains` | Referring domains per page for risk scoring |
| `site-explorer-backlinks-stats` | Backlink overview for migration impact |
| Source | Purpose |
|--------|---------|
| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
| WebSearch / WebFetch | Supplementary live data |
| Notion MCP | Save audit report to database |
## Output Format

View File

@@ -8,12 +8,12 @@ description: |
## Purpose
Comprehensive site migration planning and post-migration monitoring for SEO: crawl-based URL inventory, traffic/keyword baseline capture via Ahrefs, redirect map generation with per-URL risk scoring, pre-migration checklist creation, and post-launch traffic/indexation/ranking recovery tracking with automated alerts. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation.
Comprehensive site migration planning and post-migration monitoring for SEO: crawl-based URL inventory, traffic/keyword baseline capture via our-seo-agent CLI, redirect map generation with per-URL risk scoring, pre-migration checklist creation, and post-launch traffic/indexation/ranking recovery tracking with automated alerts. Supports domain moves, platform changes, URL restructuring, HTTPS migrations, and subdomain consolidation.
## Core Capabilities
1. **URL Inventory** - Crawl entire site via Firecrawl to capture all URLs and status codes
2. **Traffic Baseline** - Capture per-page traffic, keywords, and backlinks via Ahrefs
2. **Traffic Baseline** - Capture per-page traffic, keywords, and backlinks via our-seo-agent CLI
3. **Redirect Map Generation** - Create old URL -> new URL mappings with 301 redirect rules
4. **Risk Scoring** - Score each URL (0-100) based on traffic, backlinks, and keyword rankings
5. **Pre-Migration Checklist** - Generate type-specific migration checklist (Korean)
@@ -26,15 +26,10 @@ Comprehensive site migration planning and post-migration monitoring for SEO: cra
## MCP Tool Usage
### Ahrefs for SEO Baseline & Monitoring
### SEO Data
```
mcp__ahrefs__site-explorer-metrics: Current organic metrics (traffic, keywords)
mcp__ahrefs__site-explorer-metrics-history: Historical metrics for pre/post comparison
mcp__ahrefs__site-explorer-top-pages: Top performing pages for baseline
mcp__ahrefs__site-explorer-pages-by-traffic: Pages ranked by traffic for risk scoring
mcp__ahrefs__site-explorer-organic-keywords: Keyword rankings per page
mcp__ahrefs__site-explorer-referring-domains: Referring domains for risk scoring
mcp__ahrefs__site-explorer-backlinks-stats: Backlink overview for migration impact
our-seo-agent CLI: Primary SEO baseline data source (future); use --input for pre-fetched JSON
WebSearch / WebFetch: Supplementary migration data
```
### Firecrawl for URL Inventory & Redirect Verification
@@ -58,9 +53,9 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
### Pre-Migration Planning
1. Accept target domain, migration type, and new domain (if applicable)
2. Crawl URL inventory via Firecrawl (capture all URLs + status codes)
3. Fetch Ahrefs top pages baseline (traffic, keywords, backlinks per page)
3. Fetch top pages baseline via our-seo-agent CLI or pre-fetched data
4. Fetch site-level metrics (total traffic, keywords, referring domains)
5. Enrich URL inventory with Ahrefs traffic/backlink data
5. Enrich URL inventory with traffic/backlink data from our-seo-agent CLI
6. Score risk per URL (0-100) based on traffic weight (40%), backlinks (30%), keywords (30%)
7. Generate redirect map (old URL -> new URL) based on migration type
8. Aggregate risk assessment (high/medium/low URL counts, overall risk level)
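The per-URL risk score in step 6 is a weighted sum (traffic 40%, backlinks 30%, keywords 30%); a minimal sketch, where the max-based normalization is an assumption about how each dimension is scaled to 0-1:

```python
def risk_score(traffic: float, backlinks: float, keywords: float,
               max_traffic: float, max_backlinks: float, max_keywords: float) -> int:
    """0-100 risk score: traffic 40%, backlinks 30%, keywords 30%."""
    def norm(value: float, ceiling: float) -> float:
        return value / ceiling if ceiling else 0.0

    score = (0.40 * norm(traffic, max_traffic)
             + 0.30 * norm(backlinks, max_backlinks)
             + 0.30 * norm(keywords, max_keywords))
    return round(score * 100)


# A top-traffic page with middling backlinks/keywords relative to the site max
print(risk_score(1200, 45, 30,
                 max_traffic=1200, max_backlinks=90, max_keywords=60))  # 70
```

Higher scores mean the URL would cost more if its redirect were broken, so high-risk URLs get verified first post-launch.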
@@ -69,7 +64,7 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
### Post-Migration Monitoring
1. Accept domain, migration date, and optional baseline JSON
2. Compare pre vs post traffic using Ahrefs metrics history
2. Compare pre vs post traffic using our-seo-agent metrics history
3. Check redirect health via Firecrawl (broken, chains, loops)
4. Track indexation changes (pre vs post page count, missing pages)
5. Track keyword ranking changes for priority keywords
@@ -156,7 +151,7 @@ mcp__perplexity__search: Research migration best practices and common pitfalls
## Limitations
- Ahrefs data has ~24h freshness lag
- Data freshness depends on source and collection method
- Firecrawl crawl limited to 5,000 URLs per run
- Redirect chain detection depends on Firecrawl following redirects
- Recovery estimation is heuristic-based on industry averages

View File

@@ -98,12 +98,14 @@ python scripts/executive_report.py --report aggregated_report.json --audience c-
- Support for C-level, marketing team, and technical team audiences
- Markdown output format
## Ahrefs MCP Tools Used
## Data Sources
| Tool | Purpose |
|------|---------|
| `site-explorer-metrics` | Fresh current organic metrics snapshot |
| `site-explorer-metrics-history` | Historical metrics for trend visualization |
| Source | Purpose |
|--------|---------|
| `our-seo-agent` CLI | Primary data source (future); use `--input` for pre-fetched JSON |
| `--output-dir` flag | Scan local JSON files from skills 11-33 |
| WebSearch / WebFetch | Supplementary data for trend context |
| Notion MCP | Query past audits from SEO Audit Log database |
## Output Format

View File

@@ -20,8 +20,10 @@ from tenacity import (
retry_if_exception_type,
)
# Load environment variables
load_dotenv()
# Logging setup
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s - %(levelname)s - %(message)s",
@@ -34,6 +36,13 @@ class RateLimiter:
"""Rate limiter using token bucket algorithm."""
def __init__(self, rate: float, per: float = 1.0):
"""
Initialize rate limiter.
Args:
rate: Number of requests allowed
per: Time period in seconds (default: 1 second)
"""
self.rate = rate
self.per = per
self.tokens = rate
@@ -41,6 +50,7 @@ class RateLimiter:
self._lock = asyncio.Lock()
async def acquire(self) -> None:
"""Acquire a token, waiting if necessary."""
async with self._lock:
now = datetime.now()
elapsed = (now - self.last_update).total_seconds()
@@ -64,6 +74,14 @@ class BaseAsyncClient:
requests_per_second: float = 3.0,
logger: logging.Logger | None = None,
):
"""
Initialize base client.
Args:
max_concurrent: Maximum concurrent requests
requests_per_second: Rate limit
logger: Logger instance
"""
self.semaphore = Semaphore(max_concurrent)
self.rate_limiter = RateLimiter(requests_per_second)
self.logger = logger or logging.getLogger(self.__class__.__name__)
@@ -83,6 +101,7 @@ class BaseAsyncClient:
self,
coro: Callable[[], Any],
) -> Any:
"""Execute a request with rate limiting and retry."""
async with self.semaphore:
await self.rate_limiter.acquire()
self.stats["requests"] += 1
@@ -100,6 +119,7 @@ class BaseAsyncClient:
requests: list[Callable[[], Any]],
desc: str = "Processing",
) -> list[Any]:
"""Execute multiple requests concurrently."""
try:
from tqdm.asyncio import tqdm
has_tqdm = True
@@ -124,6 +144,7 @@ class BaseAsyncClient:
return await asyncio.gather(*tasks, return_exceptions=True)
def print_stats(self) -> None:
"""Print request statistics."""
self.logger.info("=" * 40)
self.logger.info("Request Statistics:")
self.logger.info(f" Total Requests: {self.stats['requests']}")
@@ -140,6 +161,8 @@ class ConfigManager:
@property
def google_credentials_path(self) -> str | None:
"""Get Google service account credentials path."""
# Prefer SEO-specific credentials, fallback to general credentials
seo_creds = os.path.expanduser("~/.credential/ourdigital-seo-agent.json")
if os.path.exists(seo_creds):
return seo_creds
@@ -147,23 +170,38 @@ class ConfigManager:
@property
def pagespeed_api_key(self) -> str | None:
"""Get PageSpeed Insights API key."""
return os.getenv("PAGESPEED_API_KEY")
@property
def custom_search_api_key(self) -> str | None:
"""Get Custom Search API key."""
return os.getenv("CUSTOM_SEARCH_API_KEY")
@property
def custom_search_engine_id(self) -> str | None:
"""Get Custom Search Engine ID."""
return os.getenv("CUSTOM_SEARCH_ENGINE_ID")
@property
def notion_token(self) -> str | None:
"""Get Notion API token."""
return os.getenv("NOTION_TOKEN") or os.getenv("NOTION_API_KEY")
def validate_google_credentials(self) -> bool:
"""Validate Google credentials are configured."""
creds_path = self.google_credentials_path
if not creds_path:
return False
return os.path.exists(creds_path)
def get_required(self, key: str) -> str:
"""Get required environment variable or raise error."""
value = os.getenv(key)
if not value:
raise ValueError(f"Missing required environment variable: {key}")
return value
# Singleton config instance
config = ConfigManager()

View File

@@ -453,11 +453,13 @@ CATEGORY_KOREAN_LABELS: dict[str, str] = {
"competitor": "경쟁사",
"schema": "스키마",
"kpi": "KPI",
"search_console": "Search Console",
"comprehensive": "종합 감사",
"search_console": "서치 콘솔",
"ecommerce": "이커머스",
"international": "국제 SEO",
"ai_search": "AI 검색",
"entity_seo": "엔티티 SEO",
"migration": "사이트 이전",
}

View File

@@ -123,11 +123,11 @@ CATEGORY_LABELS_KR = {
"competitor": "경쟁 분석",
"schema": "스키마/구조화 데이터",
"kpi": "KPI 프레임워크",
"search_console": "Search Console",
"search_console": "서치 콘솔",
"ecommerce": "이커머스 SEO",
"international": "국제 SEO",
"ai_search": "AI 검색 가시성",
"entity_seo": "Knowledge Graph",
"entity_seo": "지식 그래프",
}
# Common English issue descriptions -> Korean translations
@@ -434,11 +434,11 @@ class ExecutiveReportGenerator:
grade_kr = HEALTH_LABELS_KR.get(grade, grade)
trend_kr = TREND_LABELS_KR.get(summary.health_trend, summary.health_trend)
lines.append("## Health Score")
lines.append("## 종합 건강 점수")
lines.append("")
lines.append(f"| 지표 | 값 |")
lines.append(f"|------|-----|")
lines.append(f"| Overall Score | **{summary.health_score}/100** |")
lines.append(f"| 종합 점수 | **{summary.health_score}/100** |")
lines.append(f"| 등급 | {grade_kr} |")
lines.append(f"| 추세 | {trend_kr} |")
lines.append("")

View File

@@ -55,7 +55,7 @@ SKILL_REGISTRY = {
28: {"name": "knowledge-graph", "category": "entity_seo", "weight": 0.10},
31: {"name": "competitor-intel", "category": "competitor", "weight": 0.15},
32: {"name": "crawl-budget", "category": "technical", "weight": 0.10},
33: {"name": "page-experience", "category": "performance", "weight": 0.10},
33: {"name": "migration-planner", "category": "migration", "weight": 0.10},
}
CATEGORY_WEIGHTS = {
@@ -69,6 +69,13 @@ CATEGORY_WEIGHTS = {
"competitor": 0.05,
"schema": 0.05,
"kpi": 0.05,
"comprehensive": 1.0,
"search_console": 0.05,
"ecommerce": 0.05,
"international": 0.05,
"ai_search": 0.05,
"entity_seo": 0.05,
"migration": 0.05,
}
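Because the weights in `CATEGORY_WEIGHTS` need not sum to 1 once new categories are added, a normalized weighted average keeps the health score on a 0-100 scale; a minimal sketch (category names, weights, and scores here are illustrative, not the registry's actual values):

```python
# Normalized weighted average over whichever categories produced a score
weights = {"technical": 0.10, "migration": 0.05, "ai_search": 0.05}
scores = {"technical": 80, "migration": 60, "ai_search": 90}

total_weight = sum(weights[c] for c in scores)
health = sum(scores[c] * weights[c] for c in scores) / total_weight
print(round(health, 1))  # 77.5
```

Dividing by the summed weight of only the categories that reported a score also keeps the result stable when some skills were skipped in a given run.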
@@ -255,14 +262,15 @@ class ReportAggregator(BaseAsyncClient):
# Extract health score — check top-level first, then nested data dict
score_found = False
for key in ("health_score", "overall_health", "score"):
for key in ("health_score", "overall_health", "overall_score", "score",
"technical_score", "efficiency_score", "onpage_score"):
if key in data:
try:
skill_output.health_score = float(data[key])
score_found = True
break
except (ValueError, TypeError):
pass
break
continue
if not score_found:
nested = data.get("data", {})
@@ -276,9 +284,9 @@ class ReportAggregator(BaseAsyncClient):
if val is not None:
try:
skill_output.health_score = float(val)
break
except (ValueError, TypeError):
pass
break
continue
# Extract audit date
for key in ("audit_date", "report_date", "timestamp", "found_date"):

View File

@@ -20,10 +20,10 @@ Aggregate outputs from all SEO skills (11-33) into stakeholder-ready executive r
## MCP Tool Usage
### Ahrefs for Fresh Data Pull
### SEO Data
```
mcp__ahrefs__site-explorer-metrics: Pull current organic metrics snapshot for dashboard
mcp__ahrefs__site-explorer-metrics-history: Pull historical metrics for trend visualization
our-seo-agent CLI: Primary data source (future); use --input for pre-fetched JSON
WebSearch / WebFetch: Supplementary live data
```
### Notion for Reading Past Audits and Writing Reports
@@ -42,7 +42,7 @@ mcp__perplexity__*: Enrich reports with industry benchmarks and competitor conte
### Dashboard Generation
1. Accept target domain and optional date range
2. Query Notion SEO Audit Log for all past audit entries for the domain
3. Optionally pull fresh metrics from Ahrefs (site-explorer-metrics, metrics-history)
3. Optionally pull fresh metrics from our-seo-agent CLI or provide pre-fetched JSON via --input
4. Normalize all skill outputs into unified format
5. Compute cross-skill health score with weighted category dimensions
6. Identify top issues (sorted by severity) and top wins across all audits

View File

@@ -25,5 +25,8 @@ markdownify>=0.11.6
# HTTP client
httpx>=0.25.0
# Retry logic
tenacity>=8.2.0
# Data validation
pydantic>=2.5.0