Architecture
Self-hosted Skyvern has three components running on your infrastructure:

| Component | Role |
|---|---|
| Skyvern API Server | Orchestrates tasks, processes LLM responses, stores results. Includes an embedded Playwright-managed Chromium browser that executes web automation. |
| PostgreSQL | Stores task history, workflows, credentials, and organization data |
| LLM Provider | Analyzes screenshots and determines actions. You provide the API key (OpenAI, Anthropic, Azure OpenAI, Google Vertex AI, Amazon Bedrock, Groq, OpenRouter, or local via Ollama) |
How a task executes
Skyvern runs a perception-action loop for each task step:

- Screenshot: The browser captures the current page state
- Analyze: The screenshot is sent to your LLM, which identifies interactive elements and decides the next action
- Execute: Skyvern performs the action in the browser (click, type, scroll, extract data)
- Repeat: Steps 1-3 loop until the task goal is met or the step limit (`MAX_STEPS_PER_RUN`) is reached
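The loop above can be sketched in Python. The function names, callback signatures, and step-limit value here are illustrative, not Skyvern's actual internals:

```python
# Minimal sketch of the perception-action loop: screenshot -> analyze ->
# execute, repeated until the goal is met or the step limit is hit.
# All names below are hypothetical stand-ins for Skyvern's internals.
MAX_STEPS_PER_RUN = 10

def run_task(goal, capture_screenshot, ask_llm, execute_action):
    """Return the number of steps used to reach the goal (or the limit)."""
    for step in range(MAX_STEPS_PER_RUN):
        screenshot = capture_screenshot()      # 1. Screenshot
        action = ask_llm(goal, screenshot)     # 2. Analyze: LLM picks next action
        if action == "done":                   # goal met: stop looping
            return step + 1
        execute_action(action)                 # 3. Execute in the browser
    return MAX_STEPS_PER_RUN                   # step limit reached
```

In the real system the analyze step also receives the page's interactive-element tree, but the control flow is the same.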
What changes from Cloud
| You gain | You manage |
|---|---|
| Full data control: browser sessions and results stay on your network | Infrastructure: servers, scaling, uptime |
| Any LLM provider, including local models via Ollama | LLM API costs: pay your provider directly |
| No per-task pricing | Proxies: bring your own provider |
| Full access to browser configuration and extensions | Software updates: pull new Docker images manually |
| Deploy in air-gapped or restricted networks | Database backups and maintenance |
The most significant operational difference is proxies. Skyvern Cloud routes browser traffic through managed residential proxies to avoid bot detection. Self-hosted deployments need you to configure your own proxy provider.
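Because the embedded browser is Playwright-managed, proxy credentials take the shape Playwright's `chromium.launch(proxy=...)` accepts. The endpoint and credentials below are placeholders, and how Skyvern surfaces this setting in its configuration is an assumption to verify against its docs:

```python
# Hypothetical proxy settings in the dict shape Playwright expects for
# chromium.launch(proxy=...). Endpoint and credentials are placeholders
# for whatever your proxy provider issues.
proxy_settings = {
    "server": "http://proxy.example.com:8080",  # provider's proxy endpoint
    "username": "proxy_user",
    "password": "proxy_pass",
}
# e.g. with playwright.sync_api: p.chromium.launch(proxy=proxy_settings)
print(sorted(proxy_settings))
```

Residential or rotating proxies are typically what cloud deployments use for bot-detection avoidance; a plain datacenter proxy will work but is more easily flagged.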
Prerequisites
Before deploying, ensure you have:

Docker and Docker Compose
Required for containerized deployment. Install Docker
LLM API key
From one of the supported providers (OpenAI, Anthropic, Azure OpenAI, Google, Amazon Bedrock, among others). Alternatively, run local models with Ollama.
PostgreSQL
A bundled instance starts by default with Docker Compose; to use an external database, set DATABASE_STRING to point to your own instance.
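For an external database, the connection string follows the standard PostgreSQL URL form. The host, credentials, and database name below are placeholders, and the `postgresql+psycopg` driver prefix is an assumption; check Skyvern's configuration reference for the exact scheme DATABASE_STRING expects:

```python
from urllib.parse import urlparse

# Placeholder connection string for an external PostgreSQL instance.
# Host, credentials, database name, and the driver prefix are all
# illustrative values, not Skyvern defaults.
DATABASE_STRING = "postgresql+psycopg://skyvern:change-me@db.internal:5432/skyvern"

# Sanity-check the pieces the server will connect with.
parts = urlparse(DATABASE_STRING)
print(parts.hostname, parts.port, parts.path.lstrip("/"))
```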
Choose your deployment method
| Method | Best for |
|---|---|
| Docker Compose | Getting started, small teams, single-server deployments |
| Kubernetes | Production at scale, teams with existing K8s infrastructure, high availability requirements |
Next steps
Docker Setup
Get Skyvern running in 10 minutes with Docker Compose
Kubernetes Deployment
Deploy to production with Kubernetes manifests

