Cloud Execution Guide

Run AI agents on cloud VMs to explore repositories, suggest tasks, and implement code changes with automatic pull request creation

Last updated: February 2026

Overview

Cloud Execution lets you run AI agents on cloud VMs that interact directly with your GitHub repositories. Instead of copying code into chat windows, the AI agent clones your repo, reads the codebase, and works with the actual files—just like a developer would.

There are two modes, each designed for a different stage of your workflow:

Explore Mode
AI explores your repository and suggests workunit tasks based on your problem statement. It reads your code, understands the architecture, and proposes specific, actionable tasks with file references and effort estimates.
Implement Mode
AI implements your workunit tasks by writing code, running tests, and creating a pull request. It pushes a feature branch to your repository with a diff summary and test results for your review.
Recommended Workflow
Start with Explore to get AI-suggested tasks, review and accept the ones you want, then use Implement to execute them. This two-step process keeps you in control while letting AI handle the heavy lifting.

Prerequisites

Cloud Execution uses a Bring Your Own Key (BYOK) model. You provide an API key for the cloud VM provider (Sprites) and connect your OpenRouter account for AI models. Workunit orchestrates the execution, giving you full control over costs and model selection.

Sprites.dev Account

Sprites.dev provides the cloud VMs where AI agents run. Each execution spins up an isolated container with your repository cloned and ready to work.

  1. Visit sprites.dev and create an account
  2. Navigate to your account settings and generate an API token
  3. Copy the token — you'll paste it into Workunit's API key settings
Sprites charges based on VM runtime. Typical Explore sessions cost a few cents; Implement sessions vary depending on repository size and task complexity.

OpenRouter Account

OpenRouter provides access to AI models (Claude, GPT, Gemini, and more) through a unified API. You connect your OpenRouter account to Workunit through a secure OAuth flow, and the AI agent uses your account to call the model you select.

  1. Visit openrouter.ai and create an account
  2. Add credits to your account (pay-as-you-go)
You'll connect your OpenRouter account through Workunit's settings — no need to copy API keys manually. OpenRouter charges per token based on the model you select.
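
For a concrete sense of how the unified API works, here is a minimal sketch of a chat completion request through OpenRouter; the model slug and prompt are placeholders, and in practice Workunit makes this call on your behalf with the model you select:

```typescript
// Minimal sketch of a call through OpenRouter's unified, OpenAI-compatible API.
// The API key comes from your OpenRouter account; the model slug is a placeholder.
const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "anthropic/claude-3.5-sonnet", // any slug from OpenRouter's model directory
    messages: [
      { role: "user", content: "Summarize the architecture of this repository." },
    ],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content); // model output
// The usage field (when returned) reports prompt and completion token counts,
// which is what OpenRouter bills you on.
```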

GitHub App

The Workunit GitHub App connects your repositories to the platform. Implement mode needs it to push branches and create pull requests. Explore mode needs it to access private repositories.

Public Repositories
If you only use Explore mode with public repositories, you can skip installing the GitHub App. However, you'll need it to explore private repositories or to use Implement mode.

Setting Up Connections

Connections are configured per-organization in your organization settings. All team members with access to the organization can use the configured connections for cloud execution.

Sprites API Key

  1. Navigate to API Keys: Go to Organization Settings and select the API Keys tab
  2. Paste your Sprites API token into the Sprites field
  3. Click "Test Connection" to verify the key works — you'll see a green checkmark on success
  4. Save your settings
Your API key is encrypted at rest using AES-256-GCM with a key managed by Scaleway KMS. It is never stored in plaintext. See the Security section for details.

OpenRouter Connection

OpenRouter uses a secure OAuth flow to connect your account. No API keys to copy or paste.

  1. Navigate to API Keys: Go to Organization Settings and select the API Keys tab
  2. Click "Connect OpenRouter" on the OpenRouter card
  3. Authorize on OpenRouter: You'll be redirected to OpenRouter where you approve the connection
  4. Return to Workunit: After authorizing, you're redirected back and the connection is established
The OAuth flow uses PKCE (Proof Key for Code Exchange) for security. Workunit never sees or stores your OpenRouter password — only a scoped authorization token.
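
For the curious, PKCE works roughly like this: the client generates a random code verifier, sends only its SHA-256 hash (the code challenge) when starting the authorization, and reveals the verifier later when exchanging the authorization code for a token. A minimal sketch of the verifier and challenge generation, assuming Node.js crypto (Workunit handles this for you):

```typescript
import { randomBytes, createHash } from "node:crypto";

// Generate a high-entropy code verifier (43-128 characters, base64url-encoded).
const codeVerifier = randomBytes(32).toString("base64url");

// The code challenge is the SHA-256 hash of the verifier, base64url-encoded.
const codeChallenge = createHash("sha256").update(codeVerifier).digest("base64url");

// The challenge is sent with the initial authorization request; the verifier is
// only revealed when exchanging the authorization code for a token, so an
// intercepted authorization code alone is useless to an attacker.
console.log({ codeVerifier, codeChallenge });
```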

Connecting GitHub

Install the Workunit GitHub App on your repositories to enable Implement mode (push branches and create pull requests) and Explore mode on private repositories.

Installing the App

  1. Go to Organization Settings: Navigate to Organization Settings and select the GitHub tab
  2. Click "Install GitHub App" — this opens GitHub's authorization page
  3. Select which repositories to grant access to (you can choose all or select specific repos)
  4. Click "Install" on GitHub to authorize the app
  5. You'll be redirected back to Workunit with a confirmation showing the connected repositories

Permissions & Security

What the GitHub App Can Access:
  • Repository contents: Read and write access to push feature branches
  • Pull requests: Create and manage pull requests for code changes
  • Metadata: Read repository metadata (name, description, default branch)
Security Guarantees:
  • Installation tokens only: Workunit uses short-lived GitHub installation tokens, never long-lived credentials (see the sketch after this list)
  • Tokens are never stored: Installation tokens are generated on-demand and expire within one hour
  • Scoped access: The app only accesses repositories you explicitly authorize
  • Revocable: You can uninstall the app from GitHub settings at any time
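
For reference, this is roughly how a GitHub App obtains such a token: it signs a short-lived JWT with its private key and exchanges it for an installation token that GitHub expires automatically. A sketch against GitHub's REST API; the app ID, installation ID, and key path are placeholders:

```typescript
import { readFileSync } from "node:fs";
import jwt from "jsonwebtoken"; // assumes the jsonwebtoken package

// Sign a short-lived JWT as the GitHub App (valid for at most 10 minutes).
const appJwt = jwt.sign(
  {
    iat: Math.floor(Date.now() / 1000) - 60,     // issued 60s in the past to allow clock drift
    exp: Math.floor(Date.now() / 1000) + 9 * 60, // expires in 9 minutes
    iss: process.env.GITHUB_APP_ID,              // the App's ID
  },
  readFileSync("app-private-key.pem"),           // placeholder path to the App's private key
  { algorithm: "RS256" }
);

// Exchange the JWT for an installation token scoped to one installation.
const res = await fetch(
  `https://api.github.com/app/installations/${process.env.INSTALLATION_ID}/access_tokens`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${appJwt}`,
      Accept: "application/vnd.github+json",
    },
  }
);

const { token, expires_at } = await res.json();
// `token` can push branches and open PRs on the authorized repositories only,
// and GitHub invalidates it automatically at `expires_at` (about one hour later).
```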

Explore Mode

Explore mode sends an AI agent to analyze your repository and suggest tasks based on your workunit's problem statement. The agent reads your codebase, understands the architecture, and proposes specific changes with file references.

Step-by-Step Walkthrough

1. Open your workunit
Navigate to the workunit you want to explore. Make sure it has a clear problem statement — the AI uses this to understand what to look for.
2. Click "Explore" in the Cloud Execution panel
You'll find the Cloud Execution panel on the workunit detail page. Select the Explore tab.
3. Enter a GitHub repository URL
Paste the full URL of the repository you want to explore (e.g., https://github.com/your-org/your-repo). You can also specify a branch.
Tip: Set the repository URL and branch at the project level so all workunits in the project inherit them automatically.
4. Select an AI model
Choose which model the agent should use via OpenRouter. The model dropdown shows per-token pricing to help you decide.
5. Start the execution
Click "Start Explore" and watch the progress. The agent clones the repo, reads relevant files, and analyzes the codebase. This typically takes 2–5 minutes.
6. Review suggested tasks
When the agent finishes, it presents a list of suggested tasks. Review each one, then accept the tasks you want added to your workunit.

Reviewing Results

The Explore agent returns structured task suggestions. Each suggestion includes:

  • Task title: A clear, actionable description of the change
  • Description: Detailed explanation of what needs to be done and why
  • File references: Specific files and code locations the task affects
  • Reasoning: Why the AI suggests this task based on the problem statement
  • Effort estimate: Approximate complexity and time to implement
You decide what gets added. Review each suggestion, accept the useful ones, and discard the rest. Accepted tasks are added directly to your workunit with the AI-generated details preserved.
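
For illustration only, a suggestion carrying these fields might be represented like the sketch below; the field names are hypothetical, not Workunit's actual schema:

```typescript
// Hypothetical shape of one Explore suggestion; field names are illustrative,
// not Workunit's actual API schema.
interface SuggestedTask {
  title: string;            // clear, actionable description of the change
  description: string;      // what needs to be done and why
  fileReferences: string[]; // files and code locations the task affects
  reasoning: string;        // why the AI proposes this, given the problem statement
  effortEstimate: "small" | "medium" | "large"; // approximate complexity
}

const example: SuggestedTask = {
  title: "Add input validation to the registration form",
  description: "Validate email format and password length before submitting.",
  fileReferences: ["src/auth/RegisterForm.tsx", "src/auth/validation.ts"],
  reasoning: "The problem statement mentions invalid sign-ups reaching the backend.",
  effortEstimate: "small",
};
```
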
Pro Tip: Refine Your Problem Statement
If the suggested tasks don't match what you need, try making your problem statement more specific. The AI uses it as the primary guide for what to look for in the codebase.

Implement Mode

Implement mode takes your workunit tasks and executes them against your repository. The AI agent writes code, runs tests, pushes a feature branch, and creates a pull request for your review.

Requires GitHub App
Implement mode needs write access to your repository. Make sure you've installed the GitHub App before starting.

Step-by-Step Walkthrough

1. Ensure your workunit has tasks
The AI agent works from your task list. Add tasks manually or use Explore mode first to generate them from codebase analysis.
2. Click "Implement" in the Cloud Execution panel
Select the Implement tab in the Cloud Execution panel on your workunit page.
3. Enter the repository URL and select a model
Same as Explore: paste the GitHub URL and choose your preferred AI model.
4. Start the execution
Click "Start Implement" and monitor the progress. The agent clones the repo, reads the tasks, writes code, and runs tests. This may take 5–15 minutes depending on task complexity.
5. Review the results
When the agent completes, you'll see a summary of changes: files modified, tests run, and a diff overview. Review these before creating the pull request.

Creating a Pull Request

After reviewing the implementation results, you can create a pull request directly from Workunit:

What you'll see before creating the PR:
  • Diff summary: Overview of all changed files with additions and deletions
  • Test results: Whether the agent ran tests and if they passed
  • Branch name: The feature branch the agent created
  • PR description: Auto-generated description linking back to your workunit
Creating the PR:
  • Review the diff summary and test results
  • Click "Create Pull Request" to push the branch and open the PR on GitHub
  • The PR links back to your workunit for full traceability
  • Review, request changes, or merge the PR through your normal GitHub workflow
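
Under the hood, opening the PR amounts to a standard GitHub REST call against the branch the agent pushed. A rough sketch, with owner, repository, and branch names as placeholders (Workunit performs this for you using a short-lived installation token):

```typescript
// Rough sketch of the underlying GitHub REST call; all names are placeholders.
const installationToken = process.env.GITHUB_INSTALLATION_TOKEN; // short-lived, generated on demand

const res = await fetch("https://api.github.com/repos/your-org/your-repo/pulls", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${installationToken}`,
    Accept: "application/vnd.github+json",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    title: "Implement workunit tasks",
    head: "workunit/feature-branch", // the branch the agent pushed
    base: "main",                    // your default branch
    body: "Auto-generated description linking back to the workunit.",
  }),
});

const pr = await res.json();
console.log(pr.html_url); // open this URL to review, request changes, or merge
```
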
Always Review AI-Created PRs
Cloud Execution is a powerful tool, but treat AI-generated code like any other contribution. Review the diff carefully, run your CI pipeline, and ensure the changes meet your standards before merging.

Security & Encryption

Your credentials (API keys and OAuth tokens) are sensitive. Workunit uses multiple layers of encryption and security controls to protect them.

How We Protect Your Credentials

Encrypted at rest
Your credentials are encrypted before they ever touch the database. They are never stored in plaintext.
Per-organization isolation
Each organization has its own encryption key. Keys from one organization cannot decrypt another organization's data.
European KMS
Encryption keys are managed by Scaleway Key Management Service, hosted in European data centers with strict access controls.
Decrypted only when needed
Credentials are decrypted in-memory only at the moment of execution, then immediately discarded. They never appear in logs or error messages.
Audit trail
Every credential access is logged with a timestamp and context, so you can review when and why your credentials were used.

Technical Details

Envelope Encryption
Workunit uses envelope encryption, the same pattern used by AWS, GCP, and Azure for key management (a code sketch follows these steps):
  1. A Data Encryption Key (DEK) is generated per organization using a cryptographically secure random number generator
  2. The DEK encrypts your credentials (API keys and OAuth tokens) using AES-256-GCM (authenticated encryption)
  3. The DEK itself is encrypted by a Key Encryption Key (KEK) managed in Scaleway KMS
  4. Only the encrypted DEK and encrypted credentials are stored in the database
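
A minimal sketch of steps 1 and 2, assuming Node.js crypto; the KMS wrap in step 3 is shown only as a placeholder comment, since the real call goes to Scaleway KMS:

```typescript
import { randomBytes, createCipheriv } from "node:crypto";

// Step 1: generate a per-organization Data Encryption Key (DEK).
const dek = randomBytes(32); // 256-bit key

// Step 2: encrypt a credential with AES-256-GCM and a unique 96-bit nonce.
function encryptCredential(plaintext: string, key: Buffer) {
  const nonce = randomBytes(12); // 96-bit nonce, never reused with the same key
  const cipher = createCipheriv("aes-256-gcm", key, nonce);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const authTag = cipher.getAuthTag(); // 128-bit tag detects any tampering
  return { nonce, ciphertext, authTag };
}

const encrypted = encryptCredential("example-sprites-api-key", dek); // placeholder credential

// Step 3 (placeholder): the DEK itself is wrapped by the KEK held in Scaleway KMS.
// Only the wrapped DEK and the encrypted credential are persisted.
// const wrappedDek = await kms.encrypt({ keyId: ORG_KEK_ID, plaintext: dek });
```
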
Encryption Specifications
  • Algorithm: AES-256-GCM (256-bit key, 128-bit authentication tag)
  • Nonce: 96-bit, cryptographically random, unique per encryption operation
  • Key derivation: Scaleway KMS with hardware-backed key storage
  • Key rotation: KEK rotation supported without re-encrypting all data
What This Means in Practice
  • Even if the database is compromised, credentials cannot be decrypted without access to Scaleway KMS
  • Each organization's data requires a separate KMS call to decrypt, preventing bulk exposure
  • The GCM authentication tag ensures encrypted data hasn't been tampered with

Cost & Usage

Cloud Execution uses a Bring Your Own Key (BYOK) model. You pay the providers directly for what you use. Workunit does not charge extra for cloud execution.

Sprites.dev (VM Compute)
Charged by runtime duration. Typical costs:
  • Explore session: 2–5 minutes → a few cents
  • Implement session: 5–15 minutes → varies by complexity
OpenRouter (AI Models)
Charged per token based on the model selected. Pricing varies by model — the model dropdown in Workunit shows the per-token cost next to each option so you can compare before starting an execution.
You can also browse all available models and pricing at OpenRouter's model directory.
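
For a rough sense of scale, here is an illustrative calculation (the prices and token counts are made up, not actual rates): with a model priced at $3 per million input tokens and $15 per million output tokens, an Explore run that reads 150,000 input tokens and generates 8,000 output tokens costs about 0.15 × $3 + 0.008 × $15 = $0.45 + $0.12 = $0.57 in model usage, plus a few cents of Sprites VM time.
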
Cost Optimization Tip
Start with Explore to generate tasks (cheaper), then review and refine the tasks before running Implement (more expensive). A well-defined task list leads to more focused implementation and lower costs.

Troubleshooting

Common issues and how to resolve them:

"Connection failed" or "API key test failed"
Cause: The Sprites API key is invalid/expired, or the OpenRouter connection has been revoked.
Fix: Regenerate your Sprites key and paste the new key in Organization Settings. If the issue is with OpenRouter, disconnect and reconnect your account through the OAuth flow.
"Repository not found" or "Access denied"
Cause: The repository URL is incorrect or the GitHub App doesn't have access.
Fix: Verify the URL. For private repositories, or for Implement mode on any repository, ensure the GitHub App is installed on that repository.
"Execution timed out"
Cause: The repository is very large, or the AI model took too long to respond.
Fix: Try again with a faster, lighter model. For very large repos, try a more specific problem statement to narrow the scope.
"No tasks generated" after Explore
Cause: The problem statement is too vague for the AI to identify specific changes.
Fix: Make your problem statement more specific. Instead of "improve the app," try "add input validation to the user registration form in src/auth."
"Tests failed" in Implement results
Cause: The AI's code changes broke existing tests or the new tests failed.
Fix: Review the test output in the results panel. You can still create the PR and fix the failing tests manually, or run Implement again with more specific task descriptions.
"GitHub App not installed"
Cause: Trying to use Implement mode without the GitHub App connected.
Fix: Go to Organization Settings > GitHub tab and install the app. See Connecting GitHub above.

Next Steps

Ready to use Cloud Execution? Here's where to go next:

Questions About Cloud Execution?

We're here to help you get the most out of AI-powered code exploration and implementation.