sitemap-robots
Automated sitemap generation for all locale URLs, robots.txt configuration, and llms.txt for AI crawler optimization. Use when setting up sitemap.xml, configuring crawling rules, or improving discoverability for search engines and AI systems.
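For illustration, here is a minimal sketch of the kind of locale-aware sitemap this skill sets up, assuming a Next.js App Router project; the base URL, locale list, and routes below are placeholder values, not part of the skill itself.

```ts
// app/sitemap.ts — hypothetical example; base URL, locales, and routes are placeholders
import type { MetadataRoute } from 'next'

const BASE_URL = 'https://example.com'   // placeholder domain
const LOCALES = ['en', 'de', 'fr']       // placeholder locale list
const ROUTES = ['', '/about', '/blog']   // placeholder routes shared by every locale

// Emit one sitemap entry per (locale, route) pair so crawlers can discover all locale URLs.
export default function sitemap(): MetadataRoute.Sitemap {
  return LOCALES.flatMap((locale) =>
    ROUTES.map((route) => ({
      url: `${BASE_URL}/${locale}${route}`,
      lastModified: new Date(),
      changeFrequency: 'weekly' as const,
      priority: route === '' ? 1 : 0.7,
    }))
  )
}
```

A robots.txt and llms.txt can be configured alongside this to point crawlers at the generated sitemap.xml.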
$ Install
git clone https://github.com/majiayu000/claude-skill-registry /tmp/claude-skill-registry && cp -r /tmp/claude-skill-registry/skills/data/sitemap-robots ~/.claude/skills/claude-skill-registry/
Tip: Run this command in your terminal to install the skill.
Repository: majiayu000/claude-skill-registry/skills/data/sitemap-robots
Author: majiayu000
Stars: 0
Forks: 0
Updated: 1 week ago
Added: 1 week ago