Don't run agents. Build them.
Behind every agent lie months of plumbing and observability work.
Cerca helps you ship in days.
Calvin French-Owen
Founder · previously Segment, OpenAI
Shipping an agent means building two things. There's the agent itself: the prompts, the tools it can use, and its personality. And then there's the infrastructure to run it: scheduling, sandboxes, retry logic, credential storage, and context management.
The infrastructure is table stakes. The agent is what makes your product yours. Yet most teams spend the majority of their time wiring all that infrastructure together.
We're here to change that. Cerca runs the infrastructure. You build the agent. If it's working, you forget it's there.
Durable execution
Your agent runs in an isolated sandbox with its own filesystem and network. If a tool call fails, it retries. If the container dies, execution resumes from the last checkpoint.
State persists automatically. No scheduler, no queue, no recovery logic to write. The runtime keeps the operational machinery present but invisible.
```typescript
// provision a durable runtime for your user
const agent = await cerca.agents.create({
  userId: 'user_123',
  environmentId: env.id,
  configuration: {
    instructions: 'You are a diligent executive assistant.',
    tools: ['web_fetch', 'sandbox_exec', 'tool_call'],
    approvalRequired: ['sandbox_exec'],
  },
})
```

As many agents as you'd like. Isolated sandbox, persistent state, automatic recovery.
Memory that persists
The agents that feel magical remember. Each agent has persistent memory that survives across sessions and restarts.
Cerca manages the context window — providing search, compacting long conversations, and preserving what matters. Your agent picks up where it left off.
All available via API. You can take your data with you.
```typescript
// add your own memory — loaded into every thread
await cerca.context.upsert(agent.id, {
  key: 'user/',
  content: 'Product ops lead. Prefers concise bullets.',
})

const thread = await cerca.threads.create(agent.id, {
  prompt: 'Summarize unread emails from this morning.',
})

// stream events back as the agent works
thread.on('event', e => console.log('new message:', e))
```

Every thread retains full context. Pick up any conversation where it left off.
Tools without the plumbing
Your agent needs Gmail, Slack, GitHub. You shouldn't write OAuth flows for each one. Cerca manages credential storage, token refresh, and scope management.
Connect a service once; every thread can use it. Tokens stay encrypted — the agent never touches them.
```typescript
// connect Gmail — Cerca handles OAuth,
// token refresh, and scope management
const oauth = await cerca.oauth.connect('google', {
  scope: `env:${orgId}:${envId}`,
  scopes: ['gmail.modify'],
  returnOrigin: window.location.origin,
})

// after approval, attach to agent
// agent discovers google.email.*
```

Connect once. Every thread can use it. Token refresh is automatic.
Security at the core
What keeps most teams from shipping isn't capability — it's fear. An agent sending 1,000 emails. Leaked credentials. A runaway loop burning tokens overnight.
Cerca addresses all three without forcing product teams to invent their own risk model from scratch.
Approvals baked in
Any tool can require human sign-off. The thread pauses, the approval lands in your queue, and execution resumes once a decision is made. Full audit trail.
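The pause-and-resume loop can be pictured with a small in-memory sketch. Everything here — the `Approval` type, `requestApproval`, `decide` — is an illustrative stand-in for the flow, not the Cerca SDK's documented surface:

```typescript
// Minimal in-memory model of approval-gated tool calls.
// All names are hypothetical; this only illustrates the flow.
type Approval = { id: string; tool: string; decided?: 'approve' | 'deny' }

const queue: Approval[] = []

// A tool marked approvalRequired pauses the thread and enqueues a request
function requestApproval(tool: string): Approval {
  const approval: Approval = { id: `appr_${queue.length + 1}`, tool }
  queue.push(approval)
  return approval
}

// Your app reviews the queue and records a decision; execution resumes
// (or aborts) once one is made, and the record remains for the audit trail
function decide(id: string, decision: 'approve' | 'deny'): void {
  const approval = queue.find(a => a.id === id)
  if (approval) approval.decided = decision
}

const pending = requestApproval('sandbox_exec') // thread pauses here
decide(pending.id, 'approve')                   // thread resumes
```

The key property is that the pending request is durable state, not an in-process callback: the thread can stay paused for hours and still resume cleanly.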
Credentials never reach the model
OAuth tokens stay encrypted in Cerca's vault. The agent calls tools through the runtime — it never sees raw secrets.
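One way to picture this is a thin proxy: the model names a tool, the runtime resolves the credential, and only the response crosses back. A sketch under stated assumptions — a plain `Map` stands in for the encrypted vault, and `runTool` is an illustrative name, not the real API:

```typescript
// Sketch of credential proxying. In Cerca the token lives encrypted
// server-side; nothing like `vault` ever exists in your process.
const vault = new Map<string, string>([['google', 'secret-token']])

type ToolResult = { status: number; body: string }

async function runTool(connection: string, url: string): Promise<ToolResult> {
  const token = vault.get(connection) // resolved inside the runtime
  if (!token) throw new Error(`no credential for ${connection}`)
  // A real runtime would now call the upstream API with the token, e.g.
  // fetch(url, { headers: { Authorization: `Bearer ${token}` } })
  return { status: 200, body: 'ok' } // only the result reaches the model
}
```

Because the token is injected on the runtime side, a prompt-injected agent can at worst misuse a tool it already has — it can never exfiltrate the secret itself.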
Per-agent isolation
Each agent is its own security boundary. Scope permissions, set rate limits, cap budgets — design the trust model that fits your product.
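To make the trust-model idea concrete, a limits block on agent creation might look like the fragment below. Every field name here is hypothetical, shown only as a sketch of what per-agent scoping could express:

```typescript
// Hypothetical shape — field names are illustrative, not the documented API
const agent = await cerca.agents.create({
  userId: 'user_123',
  environmentId: env.id,
  configuration: { /* prompts, tools, approvals as above */ },
  limits: {
    requestsPerMinute: 60,            // rate-limit tool calls
    dailyBudgetUsd: 5,                // cap spend per agent per day
    allowedHosts: ['api.github.com'], // network scope for the sandbox
  },
})
```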
Ship the agent, not the ops
Every agent product I've admired had a team that built their own infrastructure. Every one of them, privately, wished they hadn't. Agent infrastructure is the new database: every product needs it, nobody should build it, and the winners will be teams who spent their time on craft instead of plumbing.
Cerca is in private beta. If you're building an agent with its own voice and tools, serving real users — grab an API key below, or write me directly. I read everything.