
Commit bc12171

init
0 parents commit bc12171

File tree

13 files changed: +574 −0 lines changed

‎.gitignore‎

Lines changed: 43 additions & 0 deletions

```gitignore
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
myenv/
env/
venv/
.venv/
build/
dist/
*.egg-info/

# Installer logs
pip-log.txt

# Unit test / coverage reports
htmlcov/
.tox/
nosedir/
coverage.xml
*.cover
.hypothesis/

# Environment variables
.env

# Editor directories and files
.idea/
.vscode/
*.sublime-project
*.sublime-workspace

# MacOS
.DS_Store

# Windows
Thumbs.db
```

‎Dockerfile.coderunner‎

Lines changed: 76 additions & 0 deletions

```dockerfile
# Use the specified standard Python 3.13.3 base image (Debian-based)
FROM python:3.13.3

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
ENV DEBIAN_FRONTEND=noninteractive

# Set working directory
WORKDIR /app

# Install system dependencies, including systemd
RUN apt-get update && apt-get install -y --no-install-recommends \
    systemd \
    sudo \
    curl \
    iproute2 \
    ffmpeg \
    bash \
    build-essential \
    procps \
    openssh-client \
    openssh-server \
    jq \
    kmod \
    && apt-get clean && rm -rf /var/lib/apt/lists/*

# Upgrade pip
RUN python -m pip install --no-cache-dir --upgrade pip

# Copy requirements file
COPY ./requirements.txt /app/requirements.txt

# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Install the bash kernel spec for Jupyter (not working with uv)
RUN python -m bash_kernel.install

# Copy the application code (main.py)
COPY ./main.py /app/main.py

# Copy the MCP server code (mcp_main.py)
COPY ./mcp_main.py /app/mcp_main.py

# Create application/jupyter directories
RUN mkdir -p /app/uploads /app/jupyter_runtime

# # Generate SSH host keys
# RUN ssh-keygen -A

# Clean systemd machine-id
RUN rm -f /etc/machine-id && touch /etc/machine-id

# --- Set environment variables for the application ---
ENV FASTMCP_HOST="0.0.0.0"
ENV FASTMCP_PORT="8222"

# Expose the FastMCP port
EXPOSE 8222

# Previous FastAPI startup command, kept for reference
# CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8002", "--workers", "1", "--no-access-log"]

# Copy the entrypoint script into the image
COPY entrypoint.sh /entrypoint.sh

# Make the entrypoint script executable
RUN chmod +x /entrypoint.sh

# Use the entrypoint script
ENTRYPOINT ["/entrypoint.sh"]
```

‎README.md‎

Lines changed: 75 additions & 0 deletions

![Demo](./demo.png)

# ⚡ CodeRunner: a sandbox on Apple container for code execution

CodeRunner provides an MCP server running inside a secure sandbox, powered by Apple's native [container](https://github.com/apple/container) technology, allowing you to safely execute code generated by local or remote AI models such as Ollama.

This guide is for **using** the pre-built CodeRunner sandbox.

## 🚀 Quick Start

### Prerequisites

* A Mac with Apple Silicon (M1/M2/M3/M4 series).
* **[Apple `container` tool](https://github.com/apple/container)**, installed via the **[signed installer package](https://github.com/apple/container/releases/download/0.1.0/container-0.1.0-installer-signed.pkg)**
* **Python 3.10+**

### Step 1: Configure Local Network for the Sandbox

Run these commands once to set up the `.local` top-level domain:

```bash
sudo container system dns create local
container system dns default set local
```

### Step 2: Run the Pre-Built Sandbox Container

This single command downloads the CodeRunner sandbox image from Docker Hub (if it is not already present) and runs it.

```bash
# This runs a container named 'coderunner' and makes it
# available at http://coderunner.local:8222
container run \
  --name coderunner \
  --detach --rm \
  instavm/coderunner
```

### Step 3: Run an AI Task

First, clone the repository and switch into it:

```bash
git clone https://github.com/instavm/coderunner.git
cd coderunner
```

Now you can give it prompts like `write python code to generate 100 primes` and watch it execute the code safely in the sandbox!

### Use with Ollama (Llama 3.1) via mcphost

Download `mcphost` from the [releases page](https://github.com/mark3labs/mcphost/releases/tag/v0.14.0), then:

```bash
cp cookbooks/.mcp.json ~/.mcp.json
~/Downloads/mcphost_Darwin_arm64/mcphost -m ollama:llama3.1:8b
```

### Run via the Python OpenAI Agents SDK

```bash
python cookbooks/openai_testmcp.py
```

### Use via curl

```
curl -X POST "http://coderunner.local:8222/execute/" -H "Content-Type: application/json" -d '{"command": "print(100**100)"}'

{"result":"100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\n"}
```
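The same `/execute/` endpoint can be called from Python. Below is a minimal sketch using only the standard library; the helper names (`build_execute_request`, `execute_in_sandbox`) are ours for illustration, not part of this repo, and the actual call assumes the container from Step 2 is running:

```python
# Sketch of calling the sandbox's /execute/ endpoint from Python.
# The endpoint URL and payload shape match the curl example above;
# the helper functions themselves are hypothetical.
import json
import urllib.request

BASE_URL = "http://coderunner.local:8222"


def build_execute_request(command: str, base_url: str = BASE_URL) -> urllib.request.Request:
    # Same JSON payload the curl example sends: {"command": "..."}
    payload = json.dumps({"command": command}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/execute/",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def execute_in_sandbox(command: str) -> str:
    # Performs the HTTP call and returns the "result" field of the response.
    req = build_execute_request(command)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))["result"]


# Usage (with the coderunner container from Step 2 running):
#   print(execute_in_sandbox("print(100**100)"))
```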
Lines changed: 18 additions & 0 deletions

```json
{
  "mcpServers": {
    "jejus": {
      "command": "/Users/manish/Work/venv/bin/python",
      "args": [
        "/Users/manish/Work/proxymcp.py"
      ]
    },
    "filesystem2": {
      "command": "/Users/manish/.nvm/versions/node/v23.9.0/bin/node",
      "args": [
        "/Users/manish/.nvm/versions/node/v23.9.0/lib/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
        "/Users/manish/Desktop/assets",
        "/Users/manish/Downloads/chota"
      ]
    }
  }
}
```

‎claude_mcp_proxy/mcp.py‎

Lines changed: 19 additions & 0 deletions

```python
from fastmcp import FastMCP
import socket


def resolve_with_system_dns(hostname):
    """Resolve a hostname via the system resolver (handles .local mDNS names)."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror as e:
        print(f"Error resolving {hostname}: {e}")
        return None


hostname = "coderunner.local"
address = resolve_with_system_dns(hostname)
if address is None:
    raise SystemExit(f"Could not resolve {hostname}; is the sandbox container running?")

proxy = FastMCP.as_proxy(f"http://{address}:8222/sse", name="SSE to Stdio Proxy")

# Run the proxy with stdio transport for local access
if __name__ == "__main__":
    proxy.run()
```

‎claude_mcp_proxy/requirements.txt‎

Lines changed: 1 addition & 0 deletions

```
fastmcp
```

‎cookbooks/.mcp.json‎

Lines changed: 15 additions & 0 deletions

```json
{
  "system-prompt": "cookbooks/systemprompt.txt",
  "mcpServers": {
    "coderunner": {
      "url": "http://coderunner.local:8222/sse"
    },
    "filesystem": {
      "command": "mcp-filesystem-server",
      "args": [
        "/Users/manish/Desktop/assets",
        "/Users/manish/Downloads/chota"
      ]
    }
  }
}
```

‎cookbooks/openai_testmcp.py‎

Lines changed: 85 additions & 0 deletions

```python
import asyncio
import socket

from agents import Agent, Runner, gen_trace_id, trace
from agents.mcp import MCPServer, MCPServerSse
from agents.model_settings import ModelSettings


# Earlier scripted variant, kept for reference:
# async def run(mcp_server: MCPServer):
#     agent = Agent(
#         name="Assistant",
#         instructions="Use the tools to answer the questions.",
#         mcp_servers=[mcp_server],
#         model_settings=ModelSettings(tool_choice="required"),
#     )
#
#     message = "list files in current directory using python"
#     print(f"Running: {message}")
#     result = await Runner.run(starting_agent=agent, input=message)
#     print(result.final_output)
#
#     message = "Fetch ETH price on 15th June 2025. First pip install libraries needed like yfinance, then write the code and fetch data."
#     print(f"\n\nRunning: {message}")
#     result = await Runner.run(starting_agent=agent, input=message)
#     print(result.final_output)
#
#     message = "What's the secret word?"
#     print(f"\n\nRunning: {message}")
#     result = await Runner.run(starting_agent=agent, input=message)
#     print(result.final_output)


async def run(mcp_server: MCPServer):
    agent = Agent(
        name="Assistant",
        instructions="Use the tools to answer the questions.",
        mcp_servers=[mcp_server],
        model_settings=ModelSettings(tool_choice="required"),
    )

    while True:
        message = input("Enter your prompt (or 'exit' to quit): ")
        if message.lower() == 'exit':
            print("Exiting...")
            break

        print(f"Running: {message}")
        result = await Runner.run(starting_agent=agent, input=message)
        print(result.final_output)


def resolve_with_system_dns(hostname):
    """Resolve a hostname via the system resolver (handles .local mDNS names)."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror as e:
        print(f"Error resolving {hostname}: {e}")
        return None


async def main():
    hostname = "coderunner.local"
    address = resolve_with_system_dns(hostname)
    if address is None:
        raise SystemExit(f"Could not resolve {hostname}; is the sandbox container running?")

    async with MCPServerSse(
        name="SSE Python Server",
        params={
            "url": f"http://{address}:8222/sse",
            "sse_read_timeout": 60,
            "timeout": 60,
        },
    ) as server:
        trace_id = gen_trace_id()
        with trace(workflow_name="SSE Example", trace_id=trace_id):
            print(f"View trace: https://platform.openai.com/traces/trace?trace_id={trace_id}\n")
            await run(server)


if __name__ == "__main__":
    asyncio.run(main())
```

‎cookbooks/systemprompt.txt‎

Lines changed: 13 additions & 0 deletions

```
always start answer by calling me lord voldemort.

So, we are currently on a MacBook, and whenever required we use a tool to execute code (in a Jupyter-like server). The code is executed in a container (you wouldn't notice, but just know this).

For file access we have mapped /Users/<username>/Desktop/assets to /app/uploads inside the container. That helps whenever we need a file inside the container to work on via the execute code tool.

So, a scenario could be that we want to extract 10 seconds of a video on the Mac; the steps would look like:

1. You would use the filesystem tool to copy the video file from /Users/<username>/Downloads/chota (we have access to this folder in addition to the assets one) to the assets folder. Since it is mapped to /app/uploads, it can automatically be seen/accessed from inside the container where we will execute the code.

2. Copying a file can be tricky since we only have a move function in the filesystem tools. For now let's use that.

3. Once the file is there, we can use the execute command tool to run any code, like ffmpeg etc. (which is executed inside a Jupyter cell internally).
```
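The video scenario in the prompt above ends with an ffmpeg invocation inside the container. As a hedged sketch of what step 3 might run, here is a hypothetical helper that builds such a command; the paths, function name, and flag choices are illustrative, not prescribed by this repo:

```python
# Hypothetical helper illustrating step 3: build the ffmpeg command that
# extracts the first 10 seconds of a video copied into /app/uploads.
# Uses stream copy (-c copy) so no re-encoding happens.
def build_clip_command(src: str, dst: str, start: str = "00:00:00", seconds: int = 10) -> list[str]:
    return [
        "ffmpeg",
        "-ss", start,        # seek to the start position
        "-i", src,           # input file, visible inside the container
        "-t", str(seconds),  # duration to keep
        "-c", "copy",        # copy streams instead of re-encoding
        dst,
    ]


# The command string the execute tool would run inside the container:
cmd = build_clip_command("/app/uploads/input.mp4", "/app/uploads/clip10s.mp4")
print(" ".join(cmd))
```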

‎demo.png‎

480 KB
