# Install this skill:
npx skills add rscheiwe/open-skills

Or install a specific skill: npx add-skill https://github.com/rscheiwe/open-skills/tree/main/examples/hello-world

# Description

Greeting message

# SKILL.md


---
name: hello_world
version: 1.0.0
entrypoint: scripts/main.py
description: A simple skill that greets the user
inputs:
  - type: text
    name: name
    description: Name to greet
outputs:
  - type: text
    name: greeting
    description: Greeting message
tags: [example, simple, greeting]
allow_network: false
---


# Hello World Skill

A minimal example skill that demonstrates the basic structure of an open-skills skill bundle.

## What it does

This skill takes a name as input and returns a friendly greeting message.

## Usage

{
  "name": "Alice"
}

Returns:

{
  "greeting": "Hello, Alice! Welcome to open-skills."
}

## Files

  • SKILL.md: Skill metadata and documentation (this file)
  • scripts/main.py: Main entrypoint with the run() function
  • tests/sample_input.json: Example input for testing
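The entrypoint itself stays small. A minimal sketch of what scripts/main.py could contain, assuming only the run() contract shown in the bundle-format section of the README and the sample output above:

```python
# scripts/main.py -- hypothetical sketch of the hello-world entrypoint.
# The run() contract (dict in, {"outputs": ..., "artifacts": ...} out)
# follows the skill bundle format documented in the README.

async def run(input_payload: dict) -> dict:
    name = input_payload.get("name", "world")
    return {
        "outputs": {"greeting": f"Hello, {name}! Welcome to open-skills."},
        "artifacts": [],
    }
```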

# README.md

open-skills

Python 3.11+
License: MIT

A framework-agnostic Skills subsystem for Python agents. Build, version, and execute reusable agent capabilities as code bundles — embed directly in your app or deploy as a service.

Inspired by Anthropic's Skills feature for Claude.

Overview

open-skills provides a complete system for managing executable code bundles (skills) that AI agents can discover and invoke. Think of it as a plugin system for LLM applications with version control, auto-discovery, and execution tracking.

Version 0.2.0 introduces library mode — embed open-skills directly into any Python application without running a separate service.

Key Features

✅ Framework-Agnostic — Works with OpenAI, Anthropic, LangChain, LlamaIndex, or custom agents

✅ Two Deployment Modes — Library (embedded) or Service (microservice)

✅ Auto-Discovery — Skills registered from folder structure at startup

✅ Context-Aware Prompts — Automatic skill injection into system prompts

✅ Versioned Bundles — Skills as folders with metadata, scripts, and resources

✅ Embedding-Based Search — Automatic skill selection via vector similarity

✅ Tool Manifest — Standard .well-known/skills.json for any LLM framework

✅ Real-Time Streaming — SSE for execution updates

✅ Artifact Generation — File outputs with S3-compatible storage

✅ Multi-Skill Composition — Chain or parallelize execution

Quick Start

Library Mode (Embed in Your App)

Install:

pip install open-skills

Integrate into FastAPI:

from fastapi import FastAPI
from open_skills import mount_open_skills

app = FastAPI()

@app.on_event("startup")
async def startup() -> None:
    # One-line integration (await needs an async context,
    # so it runs from a startup hook here)
    await mount_open_skills(
        app,
        skills_dir="./skills",              # Auto-discover from this folder
        database_url="postgresql+asyncpg://localhost/mydb",
        openai_api_key="sk-...",
    )

# Skills are now:
# - Auto-registered from ./skills folder
# - Discoverable at /.well-known/skills.json
# - Executable via /skills/api/runs

Use with any agent framework:

from open_skills import as_agent_tools, to_openai_tool
import openai

# Get available tools
tools = await as_agent_tools(published_only=True)
openai_tools = [to_openai_tool(t) for t in tools]

# Use with OpenAI
client = openai.AsyncOpenAI()
response = await client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize this document..."}],
    tools=openai_tools,
)

Service Mode (Microservice)

Run as standalone service:

# Using Docker Compose
docker-compose up -d

# Or directly
python -m open_skills.service.main

Access from any language:

curl http://localhost:8000/.well-known/skills.json  # Discover tools
curl -X POST http://localhost:8000/api/runs \
  -d '{"skill_version_ids": ["..."], "input": {...}}'

Two Ways to Use

| Mode    | Best For                     | Pros                                 | Cons             |
|---------|------------------------------|--------------------------------------|------------------|
| Library | Monolithic apps, low latency | In-process, zero network overhead    | Shares resources |
| Service | Microservices, polyglot apps | Process isolation, language-agnostic | Network overhead |

See INTEGRATION_GUIDE.md for complete integration patterns.

Skill Bundle Format

A skill is a directory containing:

my-skill/
├── SKILL.md          # Metadata (YAML frontmatter + description)
├── scripts/
│   └── main.py       # Entrypoint function
├── resources/        # Optional: templates, data files
│   └── template.txt
└── tests/            # Optional: test inputs
    └── sample.json
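This layout can also be scaffolded by hand with only the standard library. A sketch (the scaffold_skill helper is hypothetical; the open-skills init CLI shown later creates new skills for you):

```python
from pathlib import Path

def scaffold_skill(root: str, name: str) -> Path:
    """Hypothetical helper: create the minimal bundle layout shown above."""
    bundle = Path(root) / name
    (bundle / "scripts").mkdir(parents=True, exist_ok=True)
    (bundle / "resources").mkdir(exist_ok=True)
    (bundle / "tests").mkdir(exist_ok=True)
    (bundle / "SKILL.md").touch()  # fill in YAML frontmatter + description
    (bundle / "scripts" / "main.py").write_text(
        "async def run(input_payload: dict) -> dict:\n"
        '    return {"outputs": {}, "artifacts": []}\n'
    )
    (bundle / "tests" / "sample.json").write_text("{}\n")
    return bundle
```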

SKILL.md Example:

---
name: text_summarizer
version: 1.0.0
entrypoint: scripts/main.py
description: Summarizes long text into key points
inputs:
  - type: text
outputs:
  - type: text
tags: [nlp, summarization, text]
---

# Text Summarizer

This skill takes long text and produces a concise summary.

scripts/main.py Example:

async def run(input_payload: dict) -> dict:
    text = input_payload.get("text", "")
    summary = text[:200] + "..."  # Simple truncation

    return {
        "outputs": {"summary": summary},
        "artifacts": []
    }
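Because the entrypoint is plain async Python, it can be exercised locally without any open-skills machinery. A sketch (the run() body is copied from the example above so the snippet is self-contained):

```python
import asyncio

async def run(input_payload: dict) -> dict:
    # Same truncation-based summarizer as above
    text = input_payload.get("text", "")
    summary = text[:200] + "..."
    return {"outputs": {"summary": summary}, "artifacts": []}

result = asyncio.run(run({"text": "lorem ipsum " * 50}))
print(len(result["outputs"]["summary"]))  # 203: 200 chars plus "..."
```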

Common Use Cases

1. Embed in Existing FastAPI App

from fastapi import FastAPI
from open_skills import mount_open_skills

app = FastAPI()

# Your existing routes
@app.get("/")
async def root():
    return {"app": "my-app"}

# Add skills
@app.on_event("startup")
async def startup():
    await mount_open_skills(
        app,
        prefix="/skills",
        skills_dir="./skills",
        auto_register=True,
    )

2. Use with OpenAI Tool Calling

from open_skills import configure, as_agent_tools, to_openai_tool
from open_skills.core.executor import SkillExecutor
from open_skills.core.manager import SkillManager
import openai

# Configure library
configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Get tools
tools = await as_agent_tools()
openai_tools = [to_openai_tool(t) for t in tools]

# Call OpenAI
client = openai.AsyncOpenAI()
response = await client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Help me summarize this..."}],
    tools=openai_tools,
)

# Execute skill if tool called
if response.choices[0].message.tool_calls:
    for tool_call in response.choices[0].message.tool_calls:
        function_name = tool_call.function.name
        tool = next(t for t in tools if t["name"] == function_name)

        # Execute the skill
        # ... (see examples/openai_agents_sdk_example.py for full example)
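One way to finish the elided step is to map the tool call onto the documented POST /api/runs body. A sketch (build_run_request and the version-id value are hypothetical; the payload shape follows the curl example in Service Mode above):

```python
import json

def build_run_request(skill_version_id: str, arguments_json: str) -> dict:
    """Hypothetical helper: turn an OpenAI tool call into the
    POST /api/runs payload shown in the Service Mode example."""
    return {
        "skill_version_ids": [skill_version_id],
        "input": json.loads(arguments_json),
    }

# tool_call.function.arguments arrives from OpenAI as a JSON string
body = build_run_request(
    "123e4567-e89b-12d3-a456-426614174000",  # hypothetical skill version id
    '{"text": "Summarize this document..."}',
)
```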

3. Context-Aware Prompts (Skill Injection)

from open_skills import configure, inject_skills_context

configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Create a context-aware system prompt
base_prompt = "You are a helpful AI assistant."

# Inject available skills into the prompt
system_prompt = await inject_skills_context(
    base_prompt,
    format="detailed"  # or "compact", "numbered"
)

# Now the agent knows what skills are available
agent = Agent(system_prompt=system_prompt)

4. Auto-Discovery from Folder

from open_skills import configure, register_skills_from_folder

configure(database_url="postgresql+asyncpg://...", openai_api_key="sk-...")

# Auto-register all skills in ./skills folder
versions = await register_skills_from_folder(
    "./skills",
    auto_publish=True,
    visibility="org",
)

print(f"Registered {len(versions)} skills")

5. Real-Time Execution Streaming

# Backend (Python)
import httpx

# run_id is returned when the run is created via POST /api/runs
async with httpx.AsyncClient(base_url="http://localhost:8000") as client:
    async with client.stream("GET", f"/api/runs/{run_id}/stream") as response:
        async for line in response.aiter_lines():
            # Process Server-Sent Events
            print(line)

// Frontend (JavaScript)
const eventSource = new EventSource(`/api/runs/${runId}/stream`);

eventSource.addEventListener("status", (e) => {
  console.log("Status:", JSON.parse(e.data).status);
});

eventSource.addEventListener("complete", (e) => {
  console.log("Done:", JSON.parse(e.data));
  eventSource.close();
});

Architecture

┌─────────────────────────────────────────────────────────┐
│                    Your Application                     │
├─────────────────────────────────────────────────────────┤
│  Library Mode                 │  Service Mode           │
│  ┌─────────────────────┐      │  ┌──────────────────┐   │
│  │ mount_open_skills() │      │  │ HTTP Client      │   │
│  │  • Auto-register    │      │  │  • REST API      │   │
│  │  • Tool discovery   │      │  │  • Language-     │   │
│  │  • In-process exec  │      │  │    agnostic      │   │
│  └─────────────────────┘      │  └──────────────────┘   │
└─────────────────────────────────────────────────────────┘
                      │
                      ▼
        ┌──────────────────────────┐
        │   open-skills Core       │
        ├──────────────────────────┤
        │  • Skill Manager         │
        │  • Skill Router          │
        │  • Skill Executor        │
        │  • Auto-Discovery        │
        │  • Tool Manifest         │
        └────┬─────────────────┬───┘
             │                 │
             ▼                 ▼
        ┌─────────┐      ┌──────────┐
        │Postgres │      │    S3    │
        │+pgvector│      │Artifacts │
        └─────────┘      └──────────┘

Installation

Prerequisites

  • Python 3.11+
  • PostgreSQL 14+ with pgvector extension
  • OpenAI API key (for embeddings)

Install Package

pip install open-skills

# Or for development
git clone https://github.com/rscheiwe/open-skills.git
cd open-skills
pip install -e ".[dev]"

Database Setup

# Using Docker (recommended)
docker run -d \
  --name openskills-postgres \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=openskills \
  -p 5432:5432 \
  pgvector/pgvector:pg16

# Run migrations
alembic upgrade head

Configuration

Library Mode

from open_skills import configure

configure(
    database_url="postgresql+asyncpg://localhost/mydb",
    openai_api_key="sk-...",
    storage_root="./skills",
    artifacts_root="./artifacts",
    # Optional S3 configuration
    s3_endpoint="https://s3.amazonaws.com",
    s3_bucket="my-bucket",
)

Service Mode

Create .env file:

POSTGRES_URL=postgresql+asyncpg://user:password@localhost:5432/openskills
OPENAI_API_KEY=sk-...
JWT_SECRET=your-secret-key-here
STORAGE_ROOT=./storage
ARTIFACTS_ROOT=./artifacts

# Optional
S3_ENDPOINT=https://s3.amazonaws.com
S3_BUCKET=open-skills-artifacts
LANGFUSE_API_KEY=  # Telemetry

API Endpoints

When using mount_open_skills() or service mode:

| Endpoint                  | Method    | Description             |
|---------------------------|-----------|-------------------------|
| /.well-known/skills.json  | GET       | Tool discovery manifest |
| /api/health               | GET       | Health check            |
| /api/skills               | GET, POST | List/create skills      |
| /api/skills/{id}/versions | GET, POST | Manage versions         |
| /api/skills/search        | POST      | Embedding-based search  |
| /api/runs                 | POST      | Execute skills          |
| /api/runs/{id}            | GET       | Get run details         |
| /api/runs/{id}/stream     | GET       | Real-time SSE stream    |

See INTEGRATION_GUIDE.md for complete API reference.

CLI Tools

# Create a new skill
open-skills init my-skill

# Validate skill bundle
open-skills validate ./my-skill

# Test locally
open-skills run-local ./my-skill input.json

# Publish to service
open-skills publish ./my-skill

# Start service
open-skills serve --port 8000


Framework Compatibility

open-skills provides tool converters for:

  • OpenAI - Function calling format
  • Anthropic - Tool use format
  • LangChain - Tool format
  • Custom - Generic tool contract

from open_skills import as_agent_tools, to_openai_tool, to_anthropic_tool, to_langchain_tool

tools = await as_agent_tools()

# Convert to framework-specific formats
openai_tools = [to_openai_tool(t) for t in tools]
anthropic_tools = [to_anthropic_tool(t) for t in tools]
langchain_tools = [to_langchain_tool(t) for t in tools]

Development

Run Tests

pytest                    # All tests
pytest -m unit            # Unit tests only
pytest -m integration     # Integration tests
pytest --cov=open_skills  # With coverage

Code Quality

black open_skills tests   # Format
ruff check open_skills    # Lint
mypy open_skills          # Type check

Database Migrations

alembic revision --autogenerate -m "description"  # Create migration
alembic upgrade head                              # Apply
alembic downgrade -1                              # Rollback

Deployment

Docker (Service Mode)

docker build -t open-skills:latest .
docker run -p 8000:8000 --env-file .env open-skills:latest

Kubernetes

kubectl apply -f k8s/

Library Mode (Embedded)

Deploy as part of your application — no separate deployment needed!

See docs/deployment.md for production setup.

Troubleshooting

Skills not appearing in manifest

from open_skills.core.manager import SkillManager

# db_session is your application's async session factory
async with db_session() as db:
    manager = SkillManager(db)
    skills = await manager.list_skills()
    print(f"Found {len(skills)} skills")

Database connection issues

# Verify pgvector extension
psql -d openskills -c "\dx"

# Test connection
psql postgresql://postgres:postgres@localhost:5432/openskills

See INTEGRATION_GUIDE.md for more.

Contributing

Contributions welcome! Please read CONTRIBUTING.md for guidelines.

License

MIT License - see LICENSE file for details.

Acknowledgments

Inspired by Anthropic's Skills feature for Claude, designed to work with any LLM framework.


Current Version: 0.2.0 (Framework-Agnostic Release)
Status: Production-ready for library mode, service mode, and hybrid deployments

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.