Documentation

Everything you need to integrate and operate Skalpel AI.

Getting Started

Quick Start

Go from zero to first proxied request in under 10 minutes.

  • 1. Create a workspace at skalpel.ai
  • 2. Generate your workspace key
  • 3. Run: npx skalpel init
  • 4. Send your first request
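The steps above end with a first request through the gateway. A minimal sketch, assuming the gateway accepts OpenAI-style chat payloads with a Bearer workspace key (the model name is a placeholder, and the exact payload shape is an assumption):

```typescript
// Build a first proxied request. Sending it requires a live workspace,
// so this sketch only constructs the request object.
const body = {
  model: "gpt-4o-mini", // placeholder model name
  messages: [{ role: "user", content: "Hello through Skalpel" }],
};

const req = new Request("https://gateway.skalpel.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.SKALPEL_KEY ?? "<your-key>"}`,
  },
  body: JSON.stringify(body),
});

// With a real key set, sending is one line:
// const completion = await (await fetch(req)).json();
```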

Base URL Swap

The fastest integration path. Keep your existing SDK.

  • Replace your provider base URL with https://gateway.skalpel.ai/v1
  • Set your SKALPEL_KEY environment variable
  • Send traffic as normal
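The swap above touches only two values: the base URL and the key. Shown here as a plain config object so the sketch stays dependency-free; the field names mirror the official OpenAI SDK constructor:

```typescript
// Before: the client points straight at the provider.
const providerConfig = {
  baseURL: "https://api.openai.com/v1",
  apiKey: process.env.OPENAI_API_KEY,
};

// After: same request code, only the base URL and key change.
const skalpelConfig = {
  ...providerConfig,
  baseURL: "https://gateway.skalpel.ai/v1", // the only required URL change
  apiKey: process.env.SKALPEL_KEY,          // workspace key
};
```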

SDK Wrapper

Lightweight wrapper for OpenAI and Anthropic clients.

  • npm install @skalpel/sdk
  • Wrap your existing client instance
  • Keep your existing code intact
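A sketch of what "wrap your existing client" might look like. The `wrap` helper below is a local stand-in, not the real `@skalpel/sdk` export, and the client shape is an assumption:

```typescript
// Hypothetical wrapper shape: take an existing client config and return
// one whose traffic is routed through the gateway, leaving the original
// untouched so the rest of your code keeps working.
interface ClientLike {
  baseURL: string;
  apiKey?: string;
}

function wrap(client: ClientLike, skalpelKey: string): ClientLike {
  return {
    ...client,
    baseURL: "https://gateway.skalpel.ai/v1",
    apiKey: skalpelKey,
  };
}

const original: ClientLike = { baseURL: "https://api.openai.com/v1" };
const proxied = wrap(original, process.env.SKALPEL_KEY ?? "<key>");
```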

CLI Reference

skalpel init

Scan your repo, detect SDKs, and patch them safely.

skalpel connect

Connect your workspace and verify credentials.

skalpel doctor

Run diagnostics on your integration.

skalpel logs tail

Stream live request logs from your workspace.

API Reference

POST /v1/chat/completions

OpenAI-compatible chat endpoint.

POST /v1/messages

Anthropic-compatible messages endpoint.
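The two proxy endpoints mirror the upstream provider schemas. Minimal payloads for each, with placeholder model names (note that Anthropic's Messages API additionally requires `max_tokens`):

```typescript
// POST /v1/chat/completions — OpenAI chat schema.
const chatCompletionsBody = {
  model: "gpt-4o-mini", // placeholder
  messages: [{ role: "user", content: "Hi" }],
};

// POST /v1/messages — Anthropic messages schema; max_tokens is required.
const messagesBody = {
  model: "claude-sonnet-4-20250514", // placeholder
  max_tokens: 256,
  messages: [{ role: "user", content: "Hi" }],
};
```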

GET /api/workspaces

List workspaces.

GET /api/requests

Query request history and traces.

GET /api/billing

Current usage and savings breakdown.
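The management endpoints above are plain authenticated GETs. A sketch of one such call; the host, auth scheme, and the `limit` query parameter are assumptions, not documented behavior:

```typescript
// Query recent request history (hypothetical parameters).
const url = new URL("https://gateway.skalpel.ai/api/requests");
url.searchParams.set("limit", "50"); // hypothetical parameter

const listRequests = new Request(url, {
  headers: { Authorization: `Bearer ${process.env.SKALPEL_KEY ?? "<key>"}` },
});

// With a real key: const page = await (await fetch(listRequests)).json();
```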

Concepts

Request Lifecycle

Ingress → normalization → auth → policy resolution → cache → routing → forwarding → tracing.
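The lifecycle stages named above can be pictured as an ordered pipeline. Purely illustrative: the stage names come from this page, but the handlers here are identity stubs:

```typescript
// A request context threaded through each stage in order.
type Ctx = { path: string; trace: string[] };

const stages: Array<[string, (c: Ctx) => Ctx]> = [
  ["ingress",           (c) => c],
  ["normalization",     (c) => c],
  ["auth",              (c) => c],
  ["policy-resolution", (c) => c],
  ["cache",             (c) => c], // on a hit, later stages would be skipped
  ["routing",           (c) => c],
  ["forwarding",        (c) => c],
  ["tracing",           (c) => c],
];

const out = stages.reduce<Ctx>(
  (c, [name, fn]) => fn({ ...c, trace: [...c.trace, name] }),
  { path: "/v1/chat/completions", trace: [] },
);
// out.trace lists every stage the request passed through, in order
```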

Savings Accounting

Verified, estimated, and provider-side savings buckets.
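One way the three buckets might roll up into a reported total. The field names, bucket semantics in the comments, and the simple summing rule are assumptions:

```typescript
// Hypothetical savings breakdown (USD).
const savings = {
  verified: 12.4,    // assumed: confirmed against actual traffic
  estimated: 3.1,    // assumed: modeled, not yet confirmed
  providerSide: 5.0, // assumed: e.g. provider-cache discounts
};

const total = savings.verified + savings.estimated + savings.providerSide;
```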

Quality Floors

Shadow evaluations and kill switches for every optimization.
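A sketch of a quality floor with a kill switch: shadow-evaluation scores are compared against a floor, and the optimization is disabled if they fall below it. The scoring rule and threshold are hypothetical:

```typescript
// Returns true when the optimization should be disabled.
function killSwitch(shadowScores: number[], floor: number): boolean {
  const mean =
    shadowScores.reduce((a, b) => a + b, 0) / shadowScores.length;
  return mean < floor;
}

const disable = killSwitch([0.92, 0.95, 0.89], 0.9); // mean 0.92 ≥ 0.9
```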

Provider Adapters

Normalized interface across OpenAI, Anthropic, Google, and more.
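One way such a normalized interface could look; the method names and shapes are illustrative, not Skalpel's actual internals:

```typescript
// A normalized request/response pair, plus an adapter that translates
// to and from each provider's wire format.
interface ChatRequest {
  model: string;
  messages: { role: string; content: string }[];
}
interface ChatResponse {
  text: string;
  inputTokens: number;
  outputTokens: number;
}

interface ProviderAdapter {
  name: "openai" | "anthropic" | "google" | string;
  toUpstream(req: ChatRequest): unknown;    // provider wire format
  fromUpstream(raw: unknown): ChatResponse; // back to the normalized shape
}

// Trivial concrete adapter, just to show the contract in use.
const echoAdapter: ProviderAdapter = {
  name: "openai",
  toUpstream: (req) => req,
  fromUpstream: () => ({ text: "", inputTokens: 0, outputTokens: 0 }),
};
```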