Sim Logo

The open-source platform to build AI agents and run your agentic workforce. Connect 1,000+ integrations and LLMs to orchestrate agentic workflows.

Sim.ai Discord Twitter Documentation

Ask DeepWiki Set Up with Cursor

Build Workflows with Ease

Design agent workflows visually on a canvas—connect agents, tools, and blocks, then run them instantly.

Workflow Builder Demo

Supercharge with Copilot

Leverage Copilot to generate nodes, fix errors, and iterate on flows directly from natural language.

Copilot Demo

Integrate Vector Databases

Upload documents to a vector store and let agents answer questions grounded in your specific content.

Knowledge Uploads and Retrieval Demo

Quickstart

Cloud-hosted: sim.ai

Self-hosted: NPM Package

npx simstudio

Then open http://localhost:3000.

Note

Docker must be installed and running on your machine.

Options

  • -p, --port <port>: Port to run Sim on (default: 3000)
  • --no-pull: Skip pulling the latest Docker images

Self-hosted: Docker Compose

For local Docker builds, use the local compose file:

git clone https://github.com/simstudioai/sim.git && cd sim
docker compose -f docker-compose.local.yml up -d --build

For a cloud or production-style deployment, use the published images:

git clone https://github.com/simstudioai/sim.git && cd sim
docker compose -f docker-compose.prod.yml up -d

Open http://localhost:3000

OpenCode Setup

OpenCode is opt-in. By default the OpenCode block stays hidden so the base Sim UX and deployment path remain unchanged.

Quick deploy paths:

Sim only
  • Do not set NEXT_PUBLIC_OPENCODE_ENABLED.
  • Do not set any OPENCODE_* variables.
  • Do not use docker-compose.opencode.yml or docker-compose.opencode.local.yml.
  • Start Sim with the normal upstream flow.
Sim + OpenCode local overlay
  • Set NEXT_PUBLIC_OPENCODE_ENABLED=true.
  • Set OPENCODE_SERVER_USERNAME and OPENCODE_SERVER_PASSWORD.
  • Set OPENCODE_REPOSITORY_ROOT=/app/repos unless you intentionally changed the runtime root.
  • Set OPENCODE_REPOS to one or more HTTPS repository URLs.
  • Set at least one provider key such as OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, or GOOGLE_GENERATIVE_AI_API_KEY.
  • Set GIT_USERNAME and GIT_TOKEN or GITHUB_TOKEN if any repository is private.
  • For host-side next dev, also set OPENCODE_BASE_URL=http://127.0.0.1:4096.
  • Start with docker compose -f docker-compose.local.yml -f docker-compose.opencode.local.yml up -d --build.
Sim + external OpenCode runtime
  • Set NEXT_PUBLIC_OPENCODE_ENABLED=true.
  • Set OPENCODE_BASE_URL to the external OpenCode server.
  • Set OPENCODE_SERVER_USERNAME and OPENCODE_SERVER_PASSWORD to the credentials expected by that server.
  • Set OPENCODE_REPOSITORY_ROOT to the same worktree root used by the external OpenCode deployment.
  • Set OPENCODE_REPOS to the repository catalog you expect the runtime to clone or expose.
  • Ensure the external runtime already has at least one provider key configured.
  • Ensure the external runtime can clone private repositories with the right git credentials if needed.
  • Verify /global/health and one real prompt before exposing the block to users.

Minimum setup:

cp apps/sim/.env.example apps/sim/.env

Then add these values to apps/sim/.env:

NEXT_PUBLIC_OPENCODE_ENABLED=true
OPENCODE_REPOSITORY_ROOT=/app/repos
OPENCODE_SERVER_USERNAME=opencode
OPENCODE_SERVER_PASSWORD=change-me
OPENCODE_REPOS=https://github.com/octocat/Hello-World.git

# Pick at least one provider key that OpenCode can use
GEMINI_API_KEY=your-gemini-key
# or OPENAI_API_KEY=...
# or ANTHROPIC_API_KEY=...

If you want private repositories:

# Generic HTTPS or Azure Repos
GIT_USERNAME=your-user-or-email
GIT_TOKEN=your-token-or-pat

# Optional GitHub-only fallback
GITHUB_TOKEN=your-github-token

Important:

  • The OpenCode block remains hidden unless NEXT_PUBLIC_OPENCODE_ENABLED=true is set on the Sim app.
  • docker compose reads environment from the shell, not from apps/sim/.env automatically.
  • If you want the app and the OpenCode runtime to use the same credentials, load that file before starting compose:
set -a
source apps/sim/.env
set +a
docker compose -f docker-compose.local.yml -f docker-compose.opencode.local.yml up -d --build

Local vs production behavior:

  • docker-compose.local.yml
    • remains unchanged
  • docker-compose.opencode.local.yml
    • adds OpenCode locally without changing the base local compose file
    • publishes OPENCODE_PORT to the host so next dev on the host can talk to OpenCode
    • defaults OPENCODE_REPOSITORY_ROOT=/app/repos
    • defaults OPENCODE_SERVER_USERNAME=opencode
    • defaults OPENCODE_SERVER_PASSWORD=dev-opencode-password if you do not set one explicitly
  • docker-compose.prod.yml
    • contains the upstream-style base deployment only
  • docker-compose.opencode.yml
    • adds the opencode service as a production overlay
    • builds the OpenCode runtime locally from this repository instead of requiring an official Sim-hosted image
    • injects the required NEXT_PUBLIC_OPENCODE_ENABLED and OPENCODE_* variables into simstudio
    • keeps OpenCode internal to the Docker network with expose, not a published host port
    • defaults OPENCODE_REPOSITORY_ROOT=/app/repos
    • requires OPENCODE_SERVER_PASSWORD to be set explicitly before docker compose starts
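Since the production overlay refuses to start without an explicit password, a minimal shell environment for it might look like the sketch below. All values are placeholders to adapt, not defaults shipped by the overlay:

```shell
# Placeholder values; choose your own password before deploying.
export NEXT_PUBLIC_OPENCODE_ENABLED=true
export OPENCODE_SERVER_USERNAME=opencode
export OPENCODE_SERVER_PASSWORD=change-me   # required: the prod overlay will not start without it
export OPENCODE_REPOSITORY_ROOT=/app/repos  # matches the overlay default
```

Export these in the same shell that runs the production deploy command, since docker compose reads environment from the shell.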

Production deploy command:

docker compose -f docker-compose.prod.yml -f docker-compose.opencode.yml up -d --build

For local hot reload with next dev on the host, also set this in apps/sim/.env:

OPENCODE_BASE_URL=http://127.0.0.1:4096

Then start only the optional OpenCode runtime:

docker compose -f docker-compose.local.yml -f docker-compose.opencode.local.yml up -d --build opencode

Without that override, host-side Next.js cannot reliably reach the Docker service alias.

Notes:

  • If OPENCODE_REPOS is empty, opencode still starts but no repositories are cloned.
  • Repositories are cloned into ${OPENCODE_REPOSITORY_ROOT:-/app/repos}/<repo-name>.
  • Private Azure Repos must use HTTPS together with GIT_USERNAME and GIT_TOKEN; the container will not prompt interactively for credentials.
  • GOOGLE_GENERATIVE_AI_API_KEY is optional; the optional overlays map it automatically from GEMINI_API_KEY if not set.
  • If you prefer to run OpenCode in separate infrastructure, skip the overlays and point Sim at that deployment with OPENCODE_BASE_URL, OPENCODE_SERVER_USERNAME, OPENCODE_SERVER_PASSWORD, and the matching OPENCODE_REPOSITORY_ROOT.
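The clone-path note above can be sketched in shell, assuming <repo-name> is the URL basename with the .git suffix stripped (an assumption for illustration; the runtime defines the exact naming rule):

```shell
# Derive the expected on-disk clone path for a repository URL.
# OPENCODE_REPOSITORY_ROOT falls back to /app/repos, matching the note above.
repo_url="https://github.com/octocat/Hello-World.git"
repo_name="$(basename "$repo_url" .git)"
clone_path="${OPENCODE_REPOSITORY_ROOT:-/app/repos}/${repo_name}"
echo "$clone_path"
```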

Basic verification after startup:

curl -u "opencode:change-me" http://127.0.0.1:4096/global/health

If you changed the username, password, or port, use those values instead.

See docker/opencode/README.md for service-specific verification steps and runtime behavior.

Using Local Models with Ollama

Run Sim with local AI models using Ollama - no external APIs required:

# Start with GPU support (automatically downloads gemma3:4b model)
docker compose -f docker-compose.ollama.yml --profile setup up -d

# For CPU-only systems:
docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d

Wait for the model to download, then visit http://localhost:3000. Add more models with:

docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b

Using an External Ollama Instance

If Ollama is running on your host machine, use host.docker.internal instead of localhost:

OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

On Linux, use your host's IP address or add extra_hosts: ["host.docker.internal:host-gateway"] to the compose file.
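The Linux workaround above can be expressed as a compose override instead of editing the base file. A sketch, assuming the service is named simstudio as in the published compose files (adjust to your actual service name):

```yaml
# docker-compose.override.yml (hypothetical override file)
services:
  simstudio:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```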

Using vLLM

Sim supports vLLM for self-hosted models. Set VLLM_BASE_URL and optionally VLLM_API_KEY in your environment.
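A minimal sketch of the vLLM configuration, assuming an OpenAI-compatible vLLM server at a placeholder address (the URL and key here are examples, not defaults):

```shell
export VLLM_BASE_URL=http://localhost:8000   # placeholder: your vLLM server address
export VLLM_API_KEY=your-vllm-key            # optional: only if the server enforces auth
```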

Self-hosted: Dev Containers

  1. Open VS Code with the Remote - Containers extension
  2. Open the project and click "Reopen in Container" when prompted
  3. Run bun run dev:full in the terminal or use the sim-start alias
    • This starts both the main application and the realtime socket server

Self-hosted: Manual Setup

Requirements: Bun, Node.js v20+, PostgreSQL 12+ with pgvector

  1. Clone and install:
git clone https://github.com/simstudioai/sim.git
cd sim
bun install
  2. Set up PostgreSQL with pgvector:
docker run --name simstudio-db -e POSTGRES_PASSWORD=your_password -e POSTGRES_DB=simstudio -p 5432:5432 -d pgvector/pgvector:pg17

Or install manually via the pgvector guide.

  3. Configure environment:
cp apps/sim/.env.example apps/sim/.env
cp packages/db/.env.example packages/db/.env
# Edit both .env files to set DATABASE_URL="postgresql://postgres:your_password@localhost:5432/simstudio"

If you want to use the OpenCode workflow block while running next dev on the host, also set these in apps/sim/.env:

NEXT_PUBLIC_OPENCODE_ENABLED=true
OPENCODE_BASE_URL=http://127.0.0.1:4096
OPENCODE_SERVER_USERNAME=opencode
OPENCODE_SERVER_PASSWORD=change-me
OPENCODE_REPOS=https://github.com/octocat/Hello-World.git
GEMINI_API_KEY=your-gemini-key

Then export the same environment before starting the OpenCode container so the app and Docker use identical credentials:

set -a
source apps/sim/.env
set +a
docker compose -f docker-compose.local.yml -f docker-compose.opencode.local.yml up -d --build opencode
  4. Run migrations:
cd packages/db && bunx drizzle-kit migrate --config=./drizzle.config.ts
  5. Start development servers:
bun run dev:full  # Starts both Next.js app and realtime socket server
bun run dev:full:webpack  # Same, but using Webpack instead of Turbopack

Or run separately: bun run dev (Next.js/Turbopack), cd apps/sim && bun run dev:webpack (Next.js/Webpack), and cd apps/sim && bun run dev:sockets (realtime).

Copilot API Keys

Copilot is a Sim-managed service. To use Copilot on a self-hosted instance:

  • Go to https://sim.ai → Settings → Copilot and generate a Copilot API key
  • Set the COPILOT_API_KEY environment variable in your self-hosted apps/sim/.env file to that value

Environment Variables

See the environment variables reference for the full list, or apps/sim/.env.example for defaults.

Tech Stack

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Made with ❤️ by the Sim Team
