Show HN: OpenSkills – Stop bloating your LLM context with unused instructions

3•twwch•16h ago
Hello HN,

I’ve been building AI agents lately and ran into a common "Context Bloat" problem. When an agent has 20+ skills, stuffing every system prompt, reference doc, and tool definition into a single request quickly hits token limits and degrades model performance (the "lost in the middle" problem).

To solve this, I built OpenSkills, an open-source SDK that implements a Progressive Disclosure Architecture for agent skills.

The Core Concept: Instead of loading everything upfront, OpenSkills splits a skill into three layers (a rough sketch in code follows the list):

Layer 1 (Metadata): Lightweight tags and triggers (always loaded for discovery).

Layer 2 (Instruction): The core SKILL.md prompt (loaded only when the skill is matched).

Layer 3 (Resources): Heavy reference docs or scripts that are conditionally loaded based on the specific conversation context.
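
To make the layering concrete, here is a minimal Python sketch of the idea. The Skill class, its field names, and build_context are illustrative stand-ins, not the SDK's actual API:

    from dataclasses import dataclass, field

    @dataclass
    class Skill:
        name: str
        triggers: list[str]   # Layer 1: always-loaded metadata
        instruction: str      # Layer 2: the SKILL.md body
        resources: dict[str, list[str]] = field(default_factory=dict)
        # Layer 3: resource path -> trigger keywords

    def build_context(skills: list[Skill], query: str) -> str:
        """Assemble only the layers the query actually needs."""
        parts = []
        q = query.lower()
        for skill in skills:
            # Layer 1 is cheap, so every skill's metadata stays visible.
            parts.append(f"[skill] {skill.name}: triggers={skill.triggers}")
            if any(t in q for t in skill.triggers):
                # Layer 2 loads only when the skill matches.
                parts.append(skill.instruction)
                for path, keywords in skill.resources.items():
                    if any(k in q for k in keywords):
                        # Layer 3 loads only when a resource is relevant.
                        parts.append(f"[resource] {path}")
        return "\n".join(parts)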

Why this matters:

Scalability: You can have hundreds of skills without overwhelming the LLM's context window.

Markdown-First: Skills are defined in a simple SKILL.md format. It’s human-readable, git-friendly, and easy for the LLM to parse.

Conditional Resources: For example, a "Finance Skill" only pulls in the tax-code.pdf reference if the query actually mentions tax compliance (see the usage sketch below).
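
Using the illustrative sketch above, the finance example would read roughly like this (names and paths are hypothetical):

    finance = Skill(
        name="finance",
        triggers=["invoice", "tax", "expense"],
        instruction="# Finance Skill\nHandle invoices, expenses, and tax questions.",
        resources={"skills/finance/tax-code.pdf": ["tax", "compliance"]},
    )

    # "summarize this invoice" would load Layers 1 and 2 only; the query
    # below also mentions tax, so tax-code.pdf (Layer 3) gets attached.
    print(build_context([finance], "is this invoice tax compliant?"))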

Key Features:

Python 3.10+ SDK.

Automatic skill matching and invocation.

Support for script execution (via [INVOKE:script_name] syntax; a parsing sketch follows this list).

Multimodal support (Images via URL/base64).
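
A rough sketch of how an agent loop could detect and run these tags: the regex follows the [INVOKE:script_name] syntax above, while the script directory and runner convention are assumptions, not the SDK's documented behavior.

    import re
    import subprocess

    INVOKE_RE = re.compile(r"\[INVOKE:([A-Za-z0-9_\-]+)\]")

    def run_invocations(model_output: str, script_dir: str = "skills/scripts") -> str:
        """Replace each [INVOKE:script_name] tag with that script's stdout."""
        def _run(match: re.Match) -> str:
            script = match.group(1)
            # Assumed convention: scripts live under script_dir as .py files.
            result = subprocess.run(
                ["python", f"{script_dir}/{script}.py"],
                capture_output=True, text=True, timeout=30,
            )
            return result.stdout.strip()
        return INVOKE_RE.sub(_run, model_output)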

GitHub: https://github.com/twwch/OpenSkills
PyPI: pip install openskills-sdk

Comments

jasendo•8h ago
Interesting approach. Progressive disclosure helps with token limits, but I'm curious how you handle state across multi-step tasks where Layer 2/3 context from an earlier step becomes relevant again later? The "lost in the middle" problem is also about the model losing track of what happened 10 steps ago even if it was loaded at the time.