Fix SEO skills 19-34 bugs, add slash commands, enhance reference-curator (#3)
* Fix SEO skill 34 bugs, Korean labels, and transition Ahrefs refs to our-seo-agent

  P0: Fix report_aggregator.py — wrong SKILL_REGISTRY[33] mapping, missing CATEGORY_WEIGHTS for 7 categories, and a `break` bug in health-score parsing that exited the loop even on parse failure.
  P1: Remove VIEW tab references from skill 20, expand skill 32 docs, replace Ahrefs MCP references across all 14 skills (19-28, 31-34) with our-seo-agent CLI data-source references.
  P2: Fix Korean labels in executive_report.py and dashboard_generator.py, add tenacity to base requirements, sync skill 34 base_client.py with the canonical version from skill 12.

* Add Claude Code slash commands for SEO skills 19-34 and fix stale paths

  Create 14 new slash-command files for skills 19-28 and 31-34 so they appear as /seo-* commands in Claude Code. Also fix stale directory paths in 8 existing commands (skills 12-18, 29-30) that referenced pre-renumbering skill directories. Update .gitignore to track .claude/commands/ while keeping other .claude/ files ignored.

* Add 8 slash commands, enhance reference-curator with depth/output options

  - Add slash commands: ourdigital-brand-guide, notion-writer, notebooklm-agent, notebooklm-automation, notebooklm-studio, notebooklm-research, reference-curator, multi-agent-guide
  - Add --depth (light/standard/deep/full) with Firecrawl parameter mapping
  - Add --output with ~/Documents/reference-library/ default and user confirmation
  - Increase --max-sources default from 10 to 100
  - Rename /reference-curator-pipeline to /reference-curator
  - Simplify web-crawler-orchestrator label to web-crawler in docs

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
.claude/commands/notebooklm-research.md (new file, 66 lines)
@@ -0,0 +1,66 @@
---
description: NotebookLM research workflows - web research, Drive search, auto-import, source extraction
---

# NotebookLM Research

Research workflows: web research, Drive search, auto-import, source extraction.
## Prerequisites

```bash
pip install notebooklm-py
playwright install chromium
notebooklm login
```
## Research Commands

```bash
# Web research
notebooklm source add-research "topic"
notebooklm source add-research "topic" --mode deep --import-all
notebooklm source add-research "topic" --mode deep --no-wait

# Drive research
notebooklm source add-research "topic" --from drive

# Status and wait
notebooklm research status
notebooklm research wait --import-all
```
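The commands above can be wrapped for batch use — a sketch under the assumption that repeated `--no-wait` runs can be queued back to back; `batch_research` is a hypothetical helper name, and only the `add-research` invocation shown above is taken from this file:

```shell
# Hypothetical helper: queue deep research for several topics without blocking.
# Assumes the notebooklm CLI from Prerequisites is on PATH.
batch_research() {
  local topic
  for topic in "$@"; do
    # Fire off each research run non-blocking, exactly as in the commands above
    notebooklm source add-research "$topic" --mode deep --no-wait
  done
}
```

Usage: `batch_research "topic one" "topic two"`, followed later by `notebooklm research wait --import-all`.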
## Source Extraction

```bash
notebooklm source fulltext <id>
notebooklm source guide <id>
```
## Research Modes

| Mode | Sources | Time |
|------|---------|------|
| `fast` | 5-10 | seconds |
| `deep` | 20+ | 2-5 min |
## Subagent Pattern

```bash
# Non-blocking deep research
notebooklm source add-research "topic" --mode deep --no-wait
```

Then spawn a subagent to wait for the run and import the results:

```python
Task(
    prompt="Wait for research and import: notebooklm research wait -n {id} --import-all",
    subagent_type="general-purpose"
)
```
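The two steps of the pattern can also be sketched as a single shell helper — an illustration using only the notebooklm commands shown in this file; `deep_research` is a hypothetical name, not part of the CLI:

```shell
# Hypothetical helper combining the non-blocking pattern above.
# Assumes the notebooklm CLI from Prerequisites is on PATH.
deep_research() {
  local topic="$1"
  # Kick off deep research without blocking the current session
  notebooklm source add-research "$topic" --mode deep --no-wait
  # Later (e.g. from a subagent), block until done and import all sources
  notebooklm research wait --import-all
}
```

Usage: `deep_research "site audit"`.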
## Autonomy

**Auto-run:** `research status`, `source fulltext`, `source guide`
**Ask first:** `source add-research`, `research wait --import-all`
## Source

Full details: `custom-skills/53-notebooklm-research/code/CLAUDE.md`