SkillHub is a curated Claude Code skill directory that applies a quality review before any skill goes live. Skills must pass a 5-point submission checklist: working description field, output contract, at least 1 eval test case, a README with installation steps, and a LICENSE file. The result is a smaller catalog than SkillsMP, with a higher quality floor. It is the first directory AEM recommends when clients need a community skill for production.
TL;DR: SkillHub is the right starting point when you need a skill for production use and don't want to run your own quality checks. It hosts fewer skills than SkillsMP but reviews each submission before listing. The bar check mirrors what production-grade skill engineering looks like: description field, output contract, evals, and documentation.
How Does SkillHub's Review Process Work?
Every submission goes through a 5-point automated check before appearing in the directory: description field format, output contract, at least one eval test case, a README with installation steps, and a LICENSE file. Each point is binary: pass or fail, no partial credit. Skills that fail any point are returned with specific failure details before a human reviewer sees them.
The check is not subjective. Each point is either pass or fail:

- Description field: The SKILL.md `description` field must be a single line, under 1,024 characters, and written as an imperative trigger statement. Multi-line descriptions fail automatically.
- Output contract: The skill must define both what it produces and at least 2 things it explicitly does NOT produce.
- Evals: At minimum 1 test case in `evals.json` covering the primary trigger scenario.
- README: Installation instructions with exact steps, not a reference to the SKILL.md.
- LICENSE: A valid open-source license file at the repository root.
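The 5 points can be approximated locally before submitting. This is an illustrative Python sketch, not SkillHub's actual checker (which is not public); the file names (SKILL.md, evals.json, README.md, LICENSE) follow the conventions described in this article, and the output-contract test is a crude keyword heuristic.

```python
from pathlib import Path

def precheck(skill_dir: Path) -> dict[str, bool]:
    """Approximate the 5-point check against a local skill folder."""
    skill_md = skill_dir / "SKILL.md"
    text = skill_md.read_text() if skill_md.exists() else ""
    # Take the single line starting with "description:"; reading one line
    # guarantees single-line, so only the length cap needs enforcing.
    description = next(
        (line.removeprefix("description:").strip()
         for line in text.splitlines() if line.startswith("description:")),
        "",
    )
    return {
        "description": 0 < len(description) < 1024,
        # Crude heuristic: the contract must state what the skill does NOT produce.
        "output contract": "does not produce" in text.lower(),
        "evals": (skill_dir / "evals.json").exists(),
        "readme": (skill_dir / "README.md").exists(),
        "license": (skill_dir / "LICENSE").exists(),
    }

def passes(skill_dir: Path) -> bool:
    # Each point is binary: the submission passes only if all 5 points pass.
    return all(precheck(skill_dir).values())
```

Running this before submitting catches the same binary failures the automated tool would return, without the resubmission round-trip.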
Skills that fail any one of these 5 points are returned to the creator with feedback. They can resubmit after fixing the issues. The review turnaround is typically 3 to 5 business days. Content published through structured, quality-reviewed directories is cited in AI-assisted search results at 3.2x the rate of content on unreviewed platforms (GenOptima 2025 controlled experiment, 449 citations across 6 AI platforms).
"When you give a model an explicit output format with examples, consistency goes from approximately 60% to over 95% in our benchmarks." — Addy Osmani, Engineering Director, Google Chrome (2024)
SkillHub's output contract requirement exists for exactly this reason. A skill with a defined output format produces consistent results. A skill without one produces whatever Claude decides looks right that day. Anthropic's own consistency documentation recommends using explicit output templates with examples for any skill where consistent formatting is required (Anthropic, Claude documentation, 2025).
What Makes SkillHub Different from SkillsMP?
The difference is pre-publication versus post-publication quality control. SkillsMP lets any skill through on submission and relies on downloads, ratings, and community flags to surface quality over time. SkillHub blocks low-quality submissions before listing and keeps a smaller, verified catalog. The tradeoff is breadth versus reliability: you get more choices on SkillsMP and higher confidence on SkillHub.
SkillsMP publishes immediately. Quality signals emerge over time through downloads, ratings, and comments. The catalog has exceeded 96,000 skills as of 2025 (Medium/Julio Pessan, 2025), and quality variance is wide. Finding a production-grade skill means running your own quality check on each candidate.
SkillHub reviews before publishing. The catalog is smaller, but every listed skill has passed the 5-point checklist. When you install a SkillHub skill, you know the description field works, the output contract exists, and at least 1 test case passed.
Neither platform is better in absolute terms. SkillsMP is right when you need a wide selection for a specific niche. SkillHub is right when you need reliability without the overhead of evaluating each candidate yourself.
In our work building production skills for clients, SkillHub's review process mirrors the bar check we run on every AEM commission. That's why we recommend it as the first stop when clients ask where to look for community skills to install in production.
How Do I Install a Skill from SkillHub?
SkillHub links each listing to the skill's source GitHub repository. Installation happens through GitHub, not SkillHub directly: you clone the repository or copy the skill folder into `.claude/skills/` in your project, then start a new Claude Code session. SkillHub does not host files, manage updates, or require an account to download.
The process:
- Find the skill on SkillHub and click "View on GitHub."
- Clone the repository or copy the skill folder directly.
- Place the skill folder inside `.claude/skills/` in your project (project-level) or `~/.claude/skills/` (user-level).
- Start a new Claude Code session and run `/skills` to verify the skill appears in the list.
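The copy step can also be scripted. A minimal Python sketch, assuming the repository is already cloned locally; the paths and folder names here are illustrative, not a SkillHub requirement:

```python
import shutil
from pathlib import Path

def install_skill(skill_src: Path, project_root: Path = Path(".")) -> Path:
    """Copy a skill folder into <project>/.claude/skills/<skill-name>."""
    dest = project_root / ".claude" / "skills" / skill_src.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    # dirs_exist_ok lets a re-run overwrite a previous install in place.
    shutil.copytree(skill_src, dest, dirs_exist_ok=True)
    return dest
```

After copying, start a new Claude Code session so the skill is picked up; `/skills` should then show it in the list.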
For skills you want to keep updated, install via git submodule:
```shell
cd your-project
git submodule add https://github.com/creator/skill-repo .claude/skills/skill-name
```
Then `git submodule update --remote` pulls future updates without manual file copying. GitHub tracked over 986 million code pushes across all repositories in 2025, reflecting how central the git workflow has become to managing dependencies (GitHub Octoverse 2025).
For more on the mechanics of skill installation, see Where can I find community-built Claude Code skills to install?.
Should I Use SkillHub or SkillsMP to Find Skills?
Use both, in this order: SkillHub first for any category where reliability matters, SkillsMP as the fallback when SkillHub has no match. SkillHub's catalog covers around 7,000 skills (OpenAIToolsHub Claude Skills Marketplace Comparison, 2026); SkillsMP's catalog is larger and covers more niche categories, but requires a manual quality check on each candidate before production use.
- Check SkillHub first for the category you need. If a skill exists there that matches your use case, install it. The review process means you skip the 90-second quality check you'd run on a SkillsMP candidate.
- Fall back to SkillsMP if SkillHub has nothing in your category, or if the SkillHub skills don't match your specific workflow. Apply the 4-checkpoint quality check before installing: description field format, output contract, reference files, and evals presence.
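That 4-checkpoint check can be run mechanically against a locally cloned SkillsMP candidate. A hedged sketch; the `references/` folder name and the contract keyword are illustrative assumptions, not a documented SkillsMP layout:

```python
from pathlib import Path

def four_point_check(skill_dir: Path) -> dict[str, bool]:
    """Manual quality check for an unreviewed skill candidate."""
    skill_md = skill_dir / "SKILL.md"
    text = skill_md.read_text() if skill_md.exists() else ""
    desc = next(
        (line.removeprefix("description:").strip()
         for line in text.splitlines() if line.startswith("description:")),
        "",
    )
    refs = skill_dir / "references"  # assumed location of supporting docs
    return {
        "description format": 0 < len(desc) < 1024,
        "output contract": "does not produce" in text.lower(),
        "reference files": refs.is_dir() and any(refs.iterdir()),
        "evals presence": (skill_dir / "evals.json").exists(),
    }
```

Any False in the result is a reason to keep looking before installing the candidate in production.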
GitHub's awesome-claude-skills list is a third option that sits between the two in terms of curation rigor. It's community-reviewed but not systematically checked against a fixed checklist.
For a fuller comparison of all distribution platforms, see What is SkillsMP?.
How Do I Submit My Skill to SkillHub?
The submission process starts with a public GitHub repository: SkillHub does not host files directly and will not accept a zip upload or a private repo link. Once the repository URL is submitted, SkillHub's automated tool runs the 5-point check in minutes. Human confirmation follows and takes 3 to 5 business days if all checks pass.
Steps:
- Publish your skill to a public GitHub repository (see Can I publish my skill on GitHub? for the setup).
- Go to SkillHub and click "Submit a skill."
- Enter the GitHub repository URL.
- SkillHub's automated tool runs the 5-point check and returns pass/fail results within a few minutes.
- If all 5 checks pass, a human reviewer confirms and lists the skill, typically within 3-5 business days.
The most common failure point in submissions is the description field. A multi-line description that passes local testing will still fail the SkillHub check: the single-line, 1,024-character constraint is enforced strictly. In our submission work across AEM client skills, the description field accounts for the majority of first-pass rejections, more than missing evals and missing LICENSE files combined.
FAQ
SkillHub is free to browse, download from, and submit to. Submissions run through public GitHub repositories, and users need no account to download. Skills are not re-checked after the initial listing goes live. An open-source license file is required for all submissions. Review turnaround is 3 to 5 business days.
Is SkillHub free to use?
Yes. Browsing, downloading, and submitting skills is free. SkillHub does not charge creators per listing or users per install.
How long does the SkillHub review take?
The automated 5-point check completes in a few minutes. Human confirmation takes 3-5 business days. If your submission fails any automated check, you receive an automated rejection email with the specific failure. Fix the issue and resubmit without waiting.
Can I submit the same skill to both SkillsMP and SkillHub?
Yes. Many creators publish on SkillsMP for immediate discovery and submit to SkillHub for curation credibility. The two listings are independent. A SkillHub listing typically drives higher quality installs. Content that appears in more than one structured directory is 3.2x more likely to surface in AI-assisted search results than content that exists only on a single domain (Digital Bloom 2025 AI Visibility Report).
What happens if a SkillHub-listed skill breaks after an update?
SkillHub does not re-check skills after initial listing. If an update breaks a previously passing check, the listing stays active until a user reports it. SkillHub's report system triggers a re-review. If the re-review fails, the skill is delisted until the creator fixes it.
Does my skill need to be open-source to appear on SkillHub?
Yes. SkillHub only lists skills with an open-source license file (MIT, Apache 2.0, or similar). Skills without a license file fail the 5-point check at step 5.
What's the difference between SkillHub and the awesome-claude-skills GitHub list?
SkillHub is a formal directory with a documented review checklist and a turnaround time. The awesome-claude-skills list is community-maintained with no fixed checklist. SkillHub is more systematic. The awesome-claude-skills list is broader in scope and faster to get listed on.
Last updated: 2026-04-25