vercel-labs

observability

# Install this skill:
npx skills add vercel-labs/vercel-plugin --skill "observability"

Installs a specific skill from a multi-skill repository.

# Description

Vercel Observability expert guidance: Drains (logs, traces, speed insights, web analytics), Web Analytics, Speed Insights, runtime logs, custom events, OpenTelemetry integration, and monitoring dashboards. Use when instrumenting, debugging, or optimizing application performance and user experience on Vercel.

# SKILL.md


---
name: observability
description: Vercel Observability expert guidance - Drains (logs, traces, speed insights, web analytics), Web Analytics, Speed Insights, runtime logs, custom events, OpenTelemetry integration, and monitoring dashboards. Use when instrumenting, debugging, or optimizing application performance and user experience on Vercel.
---


Vercel Observability

You are an expert in Vercel's observability stack: Drains, Web Analytics, Speed Insights, runtime logs, and monitoring integrations.

Web Analytics

Privacy-friendly, first-party analytics with no cookie banners required.

Installation

npm install @vercel/analytics

Setup (Next.js App Router)

// app/layout.tsx
import { Analytics } from '@vercel/analytics/next'

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        {children}
        <Analytics />
      </body>
    </html>
  )
}

Custom Events (Pro/Enterprise)

Track business-specific events beyond pageviews.

import { track } from '@vercel/analytics'

// Track a conversion
track('purchase', {
  product: 'pro-plan',
  value: 20,
  currency: 'USD',
})

// Track a feature usage
track('feature_used', {
  name: 'ai-chat',
  duration_ms: 3200,
})

Server-Side Tracking

import { track } from '@vercel/analytics/server'

export async function POST(req: Request) {
  const data = await req.json()
  await processOrder(data)

  await track('order_completed', {
    order_id: data.id,
    total: data.total,
  })

  return Response.json({ success: true })
}

Speed Insights

Real-user performance monitoring built on Core Web Vitals.

Installation

npm install @vercel/speed-insights

Setup (Next.js App Router)

// app/layout.tsx
import { SpeedInsights } from '@vercel/speed-insights/next'

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        {children}
        <SpeedInsights />
      </body>
    </html>
  )
}

Metrics Tracked

| Metric | What It Measures | Good Threshold |
| ------ | ---------------- | -------------- |
| LCP | Largest Contentful Paint | < 2.5s |
| INP | Interaction to Next Paint | < 200ms |
| CLS | Cumulative Layout Shift | < 0.1 |
| FCP | First Contentful Paint | < 1.8s |
| TTFB | Time to First Byte | < 800ms |
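
As a sketch, the thresholds above can be turned into a rating helper. The "good" boundaries come from the table; the "poor" boundaries (4s LCP, 500ms INP, 0.25 CLS, 3s FCP, 1800ms TTFB) follow Google's published Web Vitals ranges, and rateMetric is a hypothetical name:

```typescript
// Hypothetical helper mirroring the thresholds in the table above.
// Between "good" and "poor" lies the standard "needs-improvement" band.
type Rating = 'good' | 'needs-improvement' | 'poor'

const THRESHOLDS: Record<string, [good: number, poor: number]> = {
  LCP: [2500, 4000], // ms
  INP: [200, 500],   // ms
  CLS: [0.1, 0.25],  // unitless
  FCP: [1800, 3000], // ms
  TTFB: [800, 1800], // ms
}

export function rateMetric(name: string, value: number): Rating {
  const t = THRESHOLDS[name]
  if (!t) throw new Error(`Unknown metric: ${name}`)
  if (value <= t[0]) return 'good'
  if (value <= t[1]) return 'needs-improvement'
  return 'poor'
}
```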

Performance Attribution

Speed Insights attributes metrics to specific routes and pages, letting you identify which pages are slow and why.

Runtime Logs

Vercel provides real-time logs for all function invocations.

Structured Logging

// app/api/process/route.ts
export async function POST(req: Request) {
  const start = Date.now()
  const data = await req.json()

  // Structured logs appear in Vercel's log viewer
  console.log(JSON.stringify({
    level: 'info',
    message: 'Processing request',
    requestId: req.headers.get('x-vercel-id'),
    payload_size: JSON.stringify(data).length,
  }))

  try {
    const result = await processData(data)
    console.log(JSON.stringify({
      level: 'info',
      message: 'Request completed',
      duration_ms: Date.now() - start,
    }))
    return Response.json(result)
  } catch (error) {
    console.error(JSON.stringify({
      level: 'error',
      message: 'Processing failed',
      error: error instanceof Error ? error.message : String(error),
      duration_ms: Date.now() - start,
    }))
    return Response.json({ error: 'Internal error' }, { status: 500 })
  }
}
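
The pattern above can be factored into a tiny helper so every log line shares the same shape. A sketch; logJson is a hypothetical name, not a Vercel API:

```typescript
// Hypothetical helper standardizing the structured-log shape used above.
// console.log and console.error map to info- and error-level entries in
// Vercel's log viewer respectively.
type LogLevel = 'info' | 'warn' | 'error'

export function logJson(
  level: LogLevel,
  message: string,
  fields: Record<string, unknown> = {}
): string {
  const line = JSON.stringify({ level, message, ...fields })
  if (level === 'error') console.error(line)
  else console.log(line)
  return line
}
```

In the route above, the try/catch branches then reduce to single calls like `logJson('error', 'Processing failed', { duration_ms: Date.now() - start })`.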

Next.js Instrumentation

// instrumentation.ts (Next.js 16)
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    // Initialize monitoring on server startup
    const { initMonitoring } = await import('./lib/monitoring')
    initMonitoring()
  }
}

Runtime Logs via REST API

Query deployment runtime logs programmatically. The endpoint returns application/stream+json: a streaming response where each line is a separate JSON object.

# Stream runtime logs for a deployment (returns application/stream+json)
curl -N -H "Authorization: Bearer $VERCEL_TOKEN" \
  "https://api.vercel.com/v3/deployments/<deployment-id>/events" \
  --max-time 120

Streaming guidance: The response is unbounded, so always set a timeout (--max-time in curl, AbortController with setTimeout in fetch). Parse line-by-line as NDJSON. Each line contains { timestamp, text, level, source }.

// Programmatic streaming with timeout
const controller = new AbortController()
const timeout = setTimeout(() => controller.abort(), 60_000) // 60s max

const res = await fetch(
  `https://api.vercel.com/v3/deployments/${deploymentId}/events`,
  {
    headers: { Authorization: `Bearer ${process.env.VERCEL_TOKEN}` },
    signal: controller.signal,
  }
)

const reader = res.body!.getReader()
const decoder = new TextDecoder()
let buffer = ''

try {
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    const lines = buffer.split('\n')
    buffer = lines.pop()! // keep incomplete line in buffer
    for (const line of lines) {
      if (!line.trim()) continue
      const event = JSON.parse(line)
      console.log(`[${event.level}] ${event.text}`)
    }
  }
  // Flush any final line left in the buffer once the stream ends
  if (buffer.trim()) {
    const event = JSON.parse(buffer)
    console.log(`[${event.level}] ${event.text}`)
  }
} finally {
  clearTimeout(timeout)
}

MCP alternative: Use get_runtime_logs via the Vercel MCP server for agent-friendly log queries without managing streams directly. See skill: vercel-api.

Drains

Drains forward observability data from Vercel to external endpoints. They are the primary mechanism for exporting logs, traces, Speed Insights, and Web Analytics data to third-party platforms.

Plan requirement: Drains require a Pro or Enterprise plan. For Hobby plans, see the Fallback Guidance section below.

Data Types

Drains can forward multiple categories of telemetry:

| Data Type | What It Contains | Use Case |
| --------- | ---------------- | -------- |
| Logs | Runtime function logs, build logs, static access logs | Centralized log aggregation |
| Traces | OpenTelemetry-compatible distributed traces | End-to-end request tracing |
| Speed Insights | Core Web Vitals and performance metrics | Performance monitoring pipelines |
| Web Analytics | Pageviews, custom events, visitor data | Analytics data warehousing |

Supported Formats

| Format | Protocol | Best For |
| ------ | -------- | -------- |
| JSON | HTTPS POST | Custom backends, generic log collectors |
| NDJSON | HTTPS POST | Streaming-friendly consumers, high-volume pipelines |
| Syslog | TLS syslog | Traditional log management (rsyslog, syslog-ng) |

Setting Up Drains

Drains are configured via the Vercel Dashboard or REST API; there is no CLI command for drains.

Via Dashboard

  1. Go to Team Settings → Log Drains in the Vercel Dashboard
  2. Click Add Log Drain
  3. Select format (JSON, NDJSON, or Syslog)
  4. Enter your endpoint URL
  5. Choose data sources to include (Static, Lambda, Edge, Build, External)
  6. Optionally filter by project or environment
  7. Save; Vercel sends a verification request with an x-vercel-signature header

Via REST API (/v1/drains)

# List all drains
curl -s -H "Authorization: Bearer $VERCEL_TOKEN" \
  "https://api.vercel.com/v1/drains?teamId=$TEAM_ID" | jq

# Create a JSON drain
curl -X POST -H "Authorization: Bearer $VERCEL_TOKEN" \
  -H "Content-Type: application/json" \
  "https://api.vercel.com/v1/drains?teamId=$TEAM_ID" \
  -d '{
    "url": "https://your-endpoint.example.com/logs",
    "type": "json",
    "sources": ["lambda", "edge", "static"],
    "environments": ["production"]
  }'

# Test a drain (sends a test payload to your endpoint)
curl -X POST -H "Authorization: Bearer $VERCEL_TOKEN" \
  "https://api.vercel.com/v1/drains/<drain-id>/test?teamId=$TEAM_ID"

# Update a drain (change URL, sources, or environments)
curl -X PATCH -H "Authorization: Bearer $VERCEL_TOKEN" \
  -H "Content-Type: application/json" \
  "https://api.vercel.com/v1/drains/<drain-id>?teamId=$TEAM_ID" \
  -d '{
    "url": "https://new-endpoint.example.com/logs",
    "environments": ["production", "preview"]
  }'

# Delete a drain
curl -X DELETE -H "Authorization: Bearer $VERCEL_TOKEN" \
  "https://api.vercel.com/v1/drains/<drain-id>?teamId=$TEAM_ID"

Web Analytics Drains Reference

When a drain is configured to receive Web Analytics data, payloads arrive as batched events. The format depends on your drain type.

JSON Payload Schema

[
  {
    "type": "pageview",
    "url": "https://example.com/blog/post-1",
    "referrer": "https://google.com",
    "timestamp": 1709568000000,
    "geo": { "country": "US", "region": "CA", "city": "San Francisco" },
    "device": { "os": "macOS", "browser": "Chrome", "isBot": false },
    "projectId": "prj_xxxxx",
    "environment": "production"
  },
  {
    "type": "custom_event",
    "name": "purchase",
    "url": "https://example.com/checkout",
    "properties": { "product": "pro-plan", "value": 20 },
    "timestamp": 1709568100000,
    "geo": { "country": "US" },
    "device": { "os": "macOS", "browser": "Chrome", "isBot": false },
    "projectId": "prj_xxxxx",
    "environment": "production"
  }
]
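
For type-safe processing, the sample payload above can be modeled as a discriminated union. A sketch; the field names are taken from the example and may not be an exhaustive schema:

```typescript
// Types derived from the sample payload above — treat field names as
// illustrative rather than a complete schema.
interface DrainEventBase {
  type: 'pageview' | 'custom_event'
  url: string
  timestamp: number
  projectId: string
  environment: string
  geo?: { country?: string; region?: string; city?: string }
  device?: { os?: string; browser?: string; isBot?: boolean }
}

interface PageviewEvent extends DrainEventBase {
  type: 'pageview'
  referrer?: string
}

interface CustomDrainEvent extends DrainEventBase {
  type: 'custom_event'
  name: string
  properties?: Record<string, unknown>
}

export type DrainEvent = PageviewEvent | CustomDrainEvent

// Narrowing on the "type" field lets TypeScript see event-specific fields.
export function isCustomEvent(e: DrainEvent): e is CustomDrainEvent {
  return e.type === 'custom_event'
}
```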

NDJSON Payload Format

Each line is a separate JSON object (one event per line):

{"type":"pageview","url":"https://example.com/","timestamp":1709568000000,"geo":{"country":"US"},"device":{"browser":"Chrome"},...}
{"type":"pageview","url":"https://example.com/about","timestamp":1709568001000,"geo":{"country":"DE"},"device":{"browser":"Firefox"},...}
{"type":"custom_event","name":"signup","url":"https://example.com/register","timestamp":1709568002000,...}

Ingestion tip: For NDJSON, process line-by-line as events arrive. This format is preferred for high-volume pipelines where batch parsing overhead matters.
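
One way to structure that line-by-line processing is a small pure function that carries the trailing partial line between chunks. A sketch; parseNdjsonChunk is a hypothetical helper, not a Vercel API:

```typescript
// Hypothetical incremental NDJSON parser: feed it raw chunks and it
// returns completed events plus the unfinished trailing line to carry
// into the next call.
export function parseNdjsonChunk(
  buffer: string,
  chunk: string
): { events: unknown[]; rest: string } {
  const lines = (buffer + chunk).split('\n')
  const rest = lines.pop() ?? '' // last element may be a partial line
  const events = lines.filter((l) => l.trim()).map((l) => JSON.parse(l))
  return { events, rest }
}
```

Call it once per body chunk in your drain endpoint, threading `rest` back in as `buffer` on the next call.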

Security: Signature Verification

Vercel signs every drain payload with an HMAC-SHA1 signature in the x-vercel-signature header. Always verify signatures in production to prevent spoofed data.

Critical: You must verify against the raw request body (not a parsed/re-serialized version). JSON parsing and re-stringifying can change key order or whitespace, breaking the signature match.

import { createHmac, timingSafeEqual } from 'crypto'

function verifyDrainSignature(rawBody: string, signature: string, secret: string): boolean {
  const expected = createHmac('sha1', secret).update(rawBody).digest('hex')
  // Use timing-safe comparison to prevent timing attacks
  if (expected.length !== signature.length) return false
  return timingSafeEqual(Buffer.from(expected), Buffer.from(signature))
}

Usage in a drain endpoint:

// app/api/drain/route.ts
export async function POST(req: Request) {
  const rawBody = await req.text()
  const signature = req.headers.get('x-vercel-signature')
  const secret = process.env.DRAIN_SECRET!

  if (!signature || !verifyDrainSignature(rawBody, signature, secret)) {
    return new Response('Unauthorized', { status: 401 })
  }

  const events = JSON.parse(rawBody)
  // Process verified events...
  return new Response('OK', { status: 200 })
}

Secret management: The drain signing secret is shown once when you create the drain. Store it in an environment variable (e.g., DRAIN_SECRET). If lost, delete and recreate the drain.

OpenTelemetry Integration

Vercel exports traces in OpenTelemetry-compatible format via Drains. Configure an OTel-compatible drain endpoint via the Dashboard or REST API (see above).
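
For Next.js apps, trace emission is typically enabled in instrumentation.ts. A minimal sketch, assuming the @vercel/otel package; the serviceName value is a placeholder:

```typescript
// instrumentation.ts
// Sketch assuming @vercel/otel: registerOTel sets up an OpenTelemetry
// provider whose spans Vercel can forward through a trace drain.
import { registerOTel } from '@vercel/otel'

export function register() {
  registerOTel({ serviceName: 'my-app' }) // 'my-app' is illustrative
}
```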

Vendor Integrations

# Install via Marketplace (recommended: auto-configures the drain)
vercel integration add datadog

Or manually create a drain via Dashboard / REST API pointing to:

| Vendor | Endpoint | Auth Header |
| ------ | -------- | ----------- |
| Datadog | https://http-intake.logs.datadoghq.com/api/v2/logs | DD-API-KEY |
| Honeycomb | https://api.honeycomb.io/1/batch/<dataset> | X-Honeycomb-Team |

Fallback Guidance (No Drains)

If drains are unavailable (Hobby plan or not yet configured), use these alternatives:

| Need | Alternative | How |
| ---- | ----------- | --- |
| View runtime logs | Vercel Dashboard | Deployments → select deployment → Logs tab |
| Stream logs from terminal | Vercel CLI | vercel logs <deployment-url> --follow (see skill: vercel-cli) |
| Query logs programmatically | MCP / REST API | get_runtime_logs tool or /v3/deployments/:id/events (see skill: vercel-api) |
| Monitor errors post-deploy | CLI | vercel logs <url> --level error --since 1h |
| Web Analytics data | Dashboard only | Analytics tab in project dashboard (no export without drains) |
| Performance metrics | Dashboard only | Speed Insights tab in project dashboard |

Upgrade path: When ready for centralized observability, upgrade to Pro and configure drains via REST API or Dashboard. The drain setup is typically < 5 minutes.

Monitoring Dashboard Patterns

Full-Stack Observability Setup

Combine all Vercel observability tools for comprehensive coverage.

// app/layout.tsx - complete observability setup
import { Analytics } from '@vercel/analytics/next'
import { SpeedInsights } from '@vercel/speed-insights/next'

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        {children}
        <Analytics />
        <SpeedInsights />
      </body>
    </html>
  )
}

Custom Monitoring with waitUntil

Fire-and-forget telemetry that doesn't block responses.

import { waitUntil } from '@vercel/functions'

export async function GET(req: Request) {
  const start = Date.now()
  const result = await fetchData()

  // Send response immediately
  const response = Response.json(result)

  // Report metrics in background; waitUntil expects a promise, not a callback
  waitUntil(
    reportMetric('api_latency', Date.now() - start, {
      route: '/api/data',
      status: 200,
    })
  )

  return response
}

Error Tracking Pattern

// lib/error-reporting.ts
export async function reportError(error: unknown, context: Record<string, unknown>) {
  const payload = {
    message: error instanceof Error ? error.message : String(error),
    stack: error instanceof Error ? error.stack : undefined,
    timestamp: new Date().toISOString(),
    ...context,
  }

  // Log for Vercel's runtime logs
  console.error(JSON.stringify(payload))

  // Also send to external service if configured
  if (process.env.ERROR_WEBHOOK_URL) {
    await fetch(process.env.ERROR_WEBHOOK_URL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    })
  }
}

Decision Matrix

| Need | Use | Why |
| ---- | --- | --- |
| Page views, traffic sources | Web Analytics | First-party, privacy-friendly |
| Business event tracking | Web Analytics custom events | Track conversions, feature usage |
| Core Web Vitals monitoring | Speed Insights | Real user data per route |
| Function debugging | Runtime Logs (CLI / Dashboard / REST) | Real-time, per-invocation logs |
| Export logs to external platform | Drains (JSON/NDJSON/Syslog) | Centralize observability (Pro+) |
| Export analytics data | Drains (Web Analytics type) | Warehouse pageviews + custom events (Pro+) |
| OpenTelemetry traces | Drains (OTel-compatible endpoint) | Standards-based distributed tracing (Pro+) |
| Post-response telemetry | waitUntil + custom reporting | Non-blocking metrics |
| Server-side event tracking | @vercel/analytics/server | Track API-triggered events |
| Hobby plan log access | CLI vercel logs + Dashboard | No drains needed |

Cross-References

  • Drains REST API & runtime logs endpoint → skill: vercel-api (Observability APIs section)
  • CLI log streaming (--follow, --since, --level) → skill: vercel-cli (Logs & Inspection section)
  • Marketplace vendor integrations → skill: marketplace

Official Documentation

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.