Caching strategies for LLM prompts including Anthropic prompt caching, response caching, and CAG (Cache Augmented Generation). Use when: prompt caching, cache prompt, response cache, cag, cache augmented.
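Anthropic's prompt caching works by marking the large, stable prefix of a request (system prompt, reference documents, tool definitions) with a cache_control breakpoint so later calls can reuse it. A minimal sketch using the anthropic Python SDK; the model id and document text are placeholders, not values taken from the entry above:

```python
# Minimal sketch of Anthropic prompt caching: mark the large, stable part of the
# prompt with cache_control so subsequent requests reuse the cached prefix.
# The model name and reference document are illustrative placeholders.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

LONG_REFERENCE_DOC = "..."  # large, rarely-changing context worth caching

response = client.messages.create(
    model="claude-sonnet-4-20250514",  # illustrative model id
    max_tokens=1024,
    system=[
        {
            "type": "text",
            "text": "You answer questions about the reference document below.",
        },
        {
            "type": "text",
            "text": LONG_REFERENCE_DOC,
            "cache_control": {"type": "ephemeral"},  # cache breakpoint
        },
    ],
    messages=[{"role": "user", "content": "Summarize section 2."}],
)
print(response.content[0].text)
```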
Modern shell prompt configuration with Powerlevel10k and Zsh Vi Mode. Use when configuring shell prompts, setting up vi/vim keybindings in zsh, customizing cursor styles per mode, adding mode...
Transform a user's 'basic prompt' into an 'optimized prompt' by applying context engineering: the right context beats the right prompt.
Transform vague prompts into precise, well-structured specifications using EARS (Easy Approach to Requirements Syntax) methodology. This skill should be used when users provide loose requirements,...
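EARS rewrites loose requirements into one of a few fixed sentence templates (ubiquitous, event-driven, state-driven, unwanted behaviour, optional). A small sketch of those templates as Python format strings; the example requirement is invented:

```python
# EARS (Easy Approach to Requirements Syntax) templates as simple format strings.
# The template wording follows the standard EARS patterns; the example is invented.
EARS_TEMPLATES = {
    "ubiquitous":   "The {system} shall {response}.",
    "event_driven": "When {trigger}, the {system} shall {response}.",
    "state_driven": "While {state}, the {system} shall {response}.",
    "unwanted":     "If {condition}, then the {system} shall {response}.",
    "optional":     "Where {feature}, the {system} shall {response}.",
}

def ears(kind: str, **fields: str) -> str:
    """Render a loose requirement into a precise EARS sentence."""
    return EARS_TEMPLATES[kind].format(**fields)

# "The app should handle bad logins" becomes:
print(ears("unwanted",
           condition="an invalid password is entered three times",
           system="authentication service",
           response="lock the account for 15 minutes"))
```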
Optimize prompts for LLMs and append to PROMPT.md
The meta-skill that powers all other AI tools. Prompt engineering for creative applications is the art and science of communicating with AI models to produce exactly what you envision—in images,...
Expert prompt optimization for LLMs and AI systems. Use PROACTIVELY when building AI features, improving agent performance, or crafting system prompts. Masters prompt patterns and techniques.
Master the art of "vibe coding" - creating playable games through natural language prompts to AI. Covers effective prompting strategies, framework choices, workflow patterns, and avoiding common...
Master advanced prompt engineering techniques to maximize LLM performance, reliability, and controllability in production. Use when optimizing prompts, improving LLM outputs, or designing...
Use when prompts produce inconsistent or unreliable outputs, need explicit structure and constraints, require safety guardrails or quality checks, involve multi-step reasoning that needs...
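A common fix for inconsistent outputs is to spell out structure and constraints in the prompt and then validate the reply before trusting it. A minimal, provider-agnostic sketch; call_llm is a hypothetical stand-in for whatever client function you use:

```python
# Sketch of an explicitly structured prompt plus a post-hoc quality check.
# call_llm() is a hypothetical placeholder for your model client.
import json

PROMPT_TEMPLATE = """You are a release-notes summarizer.

Constraints:
- Output valid JSON only, matching: {{"summary": str, "risk": "low"|"medium"|"high"}}
- Do not mention internal ticket numbers.
- If the input is empty, return {{"summary": "", "risk": "low"}}.

Input:
{text}
"""

def summarize(text: str, call_llm) -> dict:
    raw = call_llm(PROMPT_TEMPLATE.format(text=text))
    data = json.loads(raw)  # hard failure if the model ignored the format
    assert data.get("risk") in {"low", "medium", "high"}, "constraint violated"
    return data
```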
Automatically intercepts and optimizes prompts using the prompt-learning MCP server. Learns from performance over time via embedding-indexed history. Uses APE, OPRO, DSPy patterns. Activate on...
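APE- and OPRO-style optimizers share one loop: propose candidate instructions, score each against a small eval set, keep the best, and feed that history into the next proposal round. A library-agnostic sketch of that loop only (it does not use the prompt-learning MCP server or DSPy APIs); propose and score are hypothetical callables:

```python
# Library-agnostic sketch of an APE/OPRO-style prompt search loop.
# propose() and score() are hypothetical callables: propose() asks a model for
# candidate instructions given recent best-scoring prompts; score() runs a
# candidate against a small labelled eval set and returns its accuracy.
def optimize_prompt(seed: str, propose, score, rounds: int = 5, beam: int = 4):
    history = [(score(seed), seed)]
    for _ in range(rounds):
        best_score, best_prompt = max(history)
        candidates = propose(best_prompt, history[-beam:])  # new instruction variants
        history.extend((score(c), c) for c in candidates)
    return max(history)  # (score, prompt) pair with the highest eval score
```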
Prompt optimization consultant for AI system prompts and custom instructions. Supports Claude API, OpenAI API, Gemini API, CLAUDE.md, ChatGPT, and n8n. Use when user wants to improve, review, or...
Transform vague requests into production-ready, hallucination-free prompts optimized for Claude 4.x. Applies investigation-first protocols, anti-hallucination guards, extended thinking patterns,...
World-class prompt engineering skill for LLM optimization, prompt patterns, structured outputs, and AI product development. Expertise in Claude, GPT-4, prompt design patterns, few-shot learning,...
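Few-shot examples and an explicit output schema, which several entries above rely on, reduce ambiguity by showing the model the exact input/output shape expected. A provider-agnostic sketch; the example reviews and labels are invented:

```python
# Provider-agnostic few-shot prompt builder for a structured (JSON) output task.
# The example pairs below are invented purely for illustration.
FEW_SHOT_EXAMPLES = [
    ('"Battery died after two days"', '{"sentiment": "negative", "topic": "battery"}'),
    ('"Camera is stunning in low light"', '{"sentiment": "positive", "topic": "camera"}'),
]

def build_prompt(review: str) -> str:
    shots = "\n\n".join(
        f"Review: {inp}\nJSON: {out}" for inp, out in FEW_SHOT_EXAMPLES
    )
    return (
        "Classify the review. Reply with JSON only, "
        'matching {"sentiment": "positive"|"negative", "topic": str}.\n\n'
        f"{shots}\n\nReview: \"{review}\"\nJSON:"
    )

print(build_prompt("Screen scratches too easily"))
```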