
# opik-prompt

# Install this skill

```shell
npx skills add armelhbobdad/opik-skills --skill "opik-prompt"
```

This installs a specific skill from a multi-skill repository.

# Description

Manage prompt versions and run comparisons. Use when versioning prompts, comparing prompt variations, or optimizing prompt performance.

# SKILL.md

```yaml
---
name: opik-prompt
description: Manage prompt versions and run comparisons. Use when versioning prompts, comparing prompt variations, or optimizing prompt performance.
---
```


## Quick Reference

```
Create:   client.create_prompt(name="my-prompt", prompt="template")
Get:      client.get_prompt(name="my-prompt")
Version:  client.get_prompt(name="my-prompt", version=2)
```

Note: Prompt management APIs are Python-primary in the Opik SDK.

## Quick Health Check

- Python: run `opik healthcheck`
- TypeScript: verify that a config file exists at `~/.opik.config` or that env vars are set

✅ "Connection successful" / config exists → continue below
❌ "Connection failed" / no config → run /opik-setup first, then return here
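For scripted checks, the "config exists" test above can be sketched in Python as well. This is a minimal sketch: the helper name is illustrative, and the `OPIK_API_KEY` environment variable is an assumption about how credentials may be supplied, not a guarantee of the SDK's behavior.

```python
import os
from pathlib import Path

def opik_config_present() -> bool:
    """Rough readiness probe: an Opik config file or an API-key env var exists."""
    config_file = Path.home() / ".opik.config"
    return config_file.exists() or "OPIK_API_KEY" in os.environ

print(opik_config_present())
```

If this returns False, run /opik-setup before continuing.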

## Creating Prompts

```python
from opik import Opik

client = Opik()

# Create a new prompt
prompt = client.create_prompt(
    name="qa-assistant",
    prompt="Answer this question: {question}\n\nContext: {context}"
)
```

## Retrieving Prompts

### Latest Version

```python
prompt = client.get_prompt(name="qa-assistant")
print(prompt.prompt)  # The template string
```

### Specific Version

```python
prompt = client.get_prompt(name="qa-assistant", version=1)
```

## Using Prompts

### Format with Variables

```python
prompt = client.get_prompt(name="qa-assistant")

formatted = prompt.format(
    question="What is Python?",
    context="Python is a programming language."
)
# "Answer this question: What is Python?\n\nContext: Python is a programming language."
```
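On a template like the one above, the substitution behaves like Python's built-in `str.format`. The following is a local analogue for illustration only; the SDK's own `format` may handle delimiters and missing variables differently.

```python
# The stored template, as created earlier in this guide
template = "Answer this question: {question}\n\nContext: {context}"

# Substitute variables the same way prompt.format() conceptually does
formatted = template.format(
    question="What is Python?",
    context="Python is a programming language."
)
print(formatted)
```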

### With LLM Calls

```python
from openai import OpenAI
from opik import Opik

client = OpenAI()
opik = Opik()

prompt = opik.get_prompt(name="qa-assistant")
# user_question and retrieved_context come from your application
formatted = prompt.format(question=user_question, context=retrieved_context)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": formatted}]
)
```

## Version Management

### Create New Version

Creating a prompt with the same name automatically creates a new version:

```python
# Version 1
client.create_prompt(name="qa-assistant", prompt="Answer: {question}")

# Version 2 (new version, same name)
client.create_prompt(name="qa-assistant", prompt="Please answer: {question}")
```
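The "same name appends a version" behavior can be modeled with a plain in-memory store. This is a toy sketch of the semantics only, not the SDK: `create_prompt` and `store` here are local stand-ins.

```python
# Toy model: prompt name -> list of version templates (index 0 is version 1)
store: dict[str, list[str]] = {}

def create_prompt(name: str, prompt: str) -> int:
    """Local stand-in: append the template under its name, return the 1-based version."""
    versions = store.setdefault(name, [])
    versions.append(prompt)
    return len(versions)

assert create_prompt("qa-assistant", "Answer: {question}") == 1
assert create_prompt("qa-assistant", "Please answer: {question}") == 2
assert store["qa-assistant"][0] == "Answer: {question}"  # old versions stay retrievable
```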

### List Versions

See VERSIONING.md for detailed version management.

## Next Steps

- VERSIONING.md - detailed version management guide
- /opik-eval - evaluate prompts against datasets

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.