diff --git a/.prettierignore b/.prettierignore
deleted file mode 100644
index df26bac..0000000
--- a/.prettierignore
+++ /dev/null
@@ -1,4 +0,0 @@
-# frontend/**
-backend/ai/**
-backend/database/**
-backend/storage/**
\ No newline at end of file
diff --git a/backend/server/.prettierrc b/.prettierrc
similarity index 100%
rename from backend/server/.prettierrc
rename to .prettierrc
diff --git a/README.md b/README.md
index 0da6e4a..d608a6b 100644
--- a/README.md
+++ b/README.md
@@ -1,103 +1,213 @@
-# Sandbox 📦🪄
+# GitWit Sandbox 📦🪄
-Screenshot 2024-05-31 at 8 33 56 AM
+![2024-10-2307 17 42-ezgif com-resize](https://github.com/user-attachments/assets/a4057129-81a7-4a31-a093-c8bc8189ae72)
-Sandbox is an open-source cloud-based code editing environment with custom AI code autocompletion and real-time collaboration.
+Sandbox is an open-source cloud-based code editing environment with custom AI code generation, live preview, real-time collaboration and AI chat.
-Check out the [Twitter thread](https://x.com/ishaandey_/status/1796338262002573526) with the demo video!
-
-Check out this [guide](https://dev.to/jamesmurdza/how-to-setup-ishaan1013sandbox-locally-503p) made by [@jamesmurdza](https://x.com/jamesmurdza) on setting it up locally!
+For the latest updates, join our Discord server: [discord.gitwit.dev](https://discord.gitwit.dev/).
 ## Running Locally
-### Frontend
+Notes:
-Install dependencies
+- Double-check that whatever you change "SUPERDUPERSECRET" to is the same in all config files.
+- Right now we are loading project templates from a custom Cloudflare bucket which isn't covered in this guide, but that will be updated/fixed very soon.
+
+### 0. Requirements
+
+The application uses NodeJS for the backend, NextJS for the frontend, and Cloudflare Workers for additional backend tasks.
+
+Accounts you will need to set up:
+
+- [Clerk](https://clerk.com/): Used for user authentication.
+- [Liveblocks](https://liveblocks.io/): Used for collaborative editing.
+- [E2B](https://e2b.dev/): Used for the terminals and live preview.
+- [Cloudflare](https://www.cloudflare.com/): Used for relational data storage (D1) and file storage (R2).
+
+A quick overview of the tech before we start: The deployment uses a **NextJS** app for the frontend and an **ExpressJS** server on the backend. Presumably that's because NextJS integrates well with Clerk middleware but not with Socket.io.
+
+### 1. Initial setup
+
+No surprise in the first step:
 ```bash
-cd frontend
-npm install
+git clone https://github.com/jamesmurdza/sandbox
+cd sandbox
 ```
-Add the required environment variables in `.env` (example file provided in `.env.example`). You will need to make an account on [Clerk](https://clerk.com/) and [Liveblocks](https://liveblocks.io/) to get API keys.
+Run `npm install` in:
-Then, run in development mode
-
-```bash
-npm run dev
+```
+/frontend
+/backend/database
+/backend/storage
+/backend/server
+/backend/ai
 ```
-### Backend
+### 2. Adding Clerk
-The backend consists of a primary Express and Socket.io server, and 3 Cloudflare Workers microservices for the D1 database, R2 storage, and Workers AI. The D1 database also contains a [service binding](https://developers.cloudflare.com/workers/runtime-apis/bindings/service-bindings/) to the R2 storage worker. Each open sandbox instantiates a secure Linux sandboxes on E2B, which is used for the terminal and live preview.
+Set up the Clerk account.
+Get the API keys from Clerk.
-You will need to make an account on [E2B](https://e2b.dev/) to get an API key.
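A rough sketch of how these pieces talk to each other: the NextJS frontend opens a Socket.io connection to the ExpressJS server and passes the user and sandbox IDs in the handshake query, which the server-side auth middleware checks against the database worker before granting access. The real client code lives in `/frontend`; the sketch below is illustrative only, and the server URL and IDs are placeholders:

```ts
import { io } from "socket.io-client"

// Placeholder URL and IDs — the real values come from the frontend config and Clerk.
const socket = io("http://localhost:4000", {
  query: { userId: "user_123", sandboxId: "sandbox_456" },
})

// Once the sandbox container is ready, the server emits the file tree.
socket.on("loaded", (files) => console.log(files))

// Non-owners are locked out while the sandbox owner is offline.
socket.on("disableAccess", (message) => console.warn(message))
```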
+Update `/frontend/.env`:
-#### Socket.io server
-
-Install dependencies
-
-```bash
-cd backend/server
-npm install
+```
+NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY='🔑'
+CLERK_SECRET_KEY='🔑'
 ```
-Add the required environment variables in `.env` (example file provided in `.env.example`)
+### 3. Deploying the storage bucket
-Project files will be stored in the `projects/` directory. The middleware contains basic authorization logic for connecting to the server.
+Go to Cloudflare.
+Create and name an R2 storage bucket in the control panel.
+Copy the account ID shown for one of your domains.
-Run in development mode
+Update `/backend/storage/src/wrangler.toml`:
-```bash
-npm run dev
+```
+account_id = '🔑'
+bucket_name = '🔑'
+key = 'SUPERDUPERSECRET'
 ```
-This directory is dockerized, so feel free to deploy a container on any platform of your choice! I chose not to deploy this project for public access due to costs & safety, but deploying your own for personal use should be no problem.
+In the `/backend/storage/src` directory:
-#### Cloudflare Workers (Database, Storage, AI)
-
-Directories:
-
-- `/backend/database`: D1 database
-- `/backend/storage`: R2 storage
-- `/backend/ai`: Workers AI
-
-Install dependencies
-
-```bash
-cd backend/database
-npm install
-
-cd ../storage
-npm install
-
-cd ../ai
-npm install
+```
-
-Read the [documentation](https://developers.cloudflare.com/workers/) to learn more about workers.
-
-For each directory, add the required environment variables in `wrangler.toml` (example file provided in `wrangler.example.toml`). For the AI worker, you can define any value you want for the `CF_AI_KEY` -- set this in other `.env` files to authorize access.
-
-Run in development mode
-
-```bash
-npm run dev
-```
-
-Deploy to Cloudflare with [Wrangler](https://developers.cloudflare.com/workers/wrangler/install-and-update/)
-
-```bash
 npx wrangler deploy
 ```
---
+### 4. Deploying the database
+
+Create a database:
+
+```
+npx wrangler d1 create sandbox-database
+```
+
+Use the output for the next step.
+
+Update `/backend/database/src/wrangler.toml`:
+
+```
+database_name = '🔑'
+database_id = '🔑'
+KEY = 'SUPERDUPERSECRET'
+STORAGE_WORKER_URL = 'https://storage.🍎.workers.dev'
+```
+
+In the `/backend/database/src` directory:
+
+```
+npx wrangler deploy
+```
+
+### 5. Applying the database schema
+
+Delete the `/backend/database/drizzle/meta` directory.
+
+In the `/backend/database/` directory:
+
+```
+npm run generate
+npx wrangler d1 execute sandbox-database --remote --file=./drizzle/0000_🍏_🍐.sql
+```
+
+### 6. Configuring the server
+
+Update `/backend/server/.env`:
+
+```
+DATABASE_WORKER_URL='https://database.🍎.workers.dev'
+STORAGE_WORKER_URL='https://storage.🍎.workers.dev'
+WORKERS_KEY='SUPERDUPERSECRET'
+```
+
+### 7. Adding Liveblocks
+
+Set up the Liveblocks account.
+
+Update `/frontend/.env`:
+
+```
+NEXT_PUBLIC_LIVEBLOCKS_PUBLIC_KEY='🔑'
+LIVEBLOCKS_SECRET_KEY='🔑'
+```
+
+### 8. Adding E2B
+
+Set up the E2B account.
+
+Update `/backend/server/.env`:
+
+```
+E2B_API_KEY='🔑'
+```
+
+### 9. Adding AI code generation
+
+In the `/backend/ai` directory:
+
+```
+npx wrangler deploy
+```
+
+Update `/backend/server/.env`:
+
+```
+AI_WORKER_URL='https://ai.🍎.workers.dev'
+```
+
+### 10. Configuring the frontend
+
+Update `/frontend/.env`:
+
+```
+NEXT_PUBLIC_DATABASE_WORKER_URL='https://database.🍎.workers.dev'
+NEXT_PUBLIC_STORAGE_WORKER_URL='https://storage.🍎.workers.dev'
+NEXT_PUBLIC_WORKERS_KEY='SUPERDUPERSECRET'
+```
+
+### 11. 
Running the IDE + +Run `npm run dev` simultaneously in: + +``` +/frontend +/backend/server +``` + +## Setting up Deployments + +The steps above do not include steps to setup [Dokku](https://github.com/dokku/dokku), which is required for deployments. + +**Note:** This is completely optional to set up if you just want to run GitWit Sandbox. + +Setting up deployments first requires a separate domain (such as gitwit.app, which we use). + +We then deploy Dokku on a separate server, according to this guide: https://dev.to/jamesmurdza/host-your-own-paas-platform-as-a-service-on-amazon-web-services-3f0d + +The Sandbox platform connects to the Dokku server via SSH, using SSH keys specifically generated for this connection. The SSH key is stored on the Sandbox server, and the following environment variables are set in /backend/server/.env: + +```bash +DOKKU_HOST= +DOKKU_USERNAME= +DOKKU_KEY= +``` + +## Creating Custom Templates + +We're working on a process whereby anyone can contribute a custom template that others can use in the Sandbox environment. The process includes: + +- Creating a [custom E2B Sandbox](https://e2b.dev/docs/sandbox-template) including the template files and dependencies +- Creating a file to specify the run command (e.g. "npm run dev") +- Testing the template with Dokku for deployment + +Please reach out to us [on Discord](https://discord.gitwit.dev/) if you're interested in contributing. ## Contributing -Thanks for your interest in contributing! Review this section before submitting your first pull request. If you need any help, feel free to reach out to [@ishaandey\_](https://x.com/ishaandey_). - -Please prioritize existing issues, but feel free to contribute new issues if you have ideas for a feature or bug that you think would be useful. +Thanks for your interest in contributing! Review this section before submitting your first pull request. If you need any help, feel free contact us [on Discord](https://discord.gitwit.dev/). ### Structure @@ -116,13 +226,13 @@ backend/ └── ai ``` -| Path | Description | -| ------------------ | -------------------------------------------------------------------------- | -| `frontend` | The Next.js application for the frontend. | -| `backend/server` | The Express websocket server. | -| `backend/database` | API for interfacing with the D1 database (SQLite). | +| Path | Description | +| ------------------ | ------------------------------------------------------------ | +| `frontend` | The Next.js application for the frontend. | +| `backend/server` | The Express websocket server. | +| `backend/database` | API for interfacing with the D1 database (SQLite). | | `backend/storage` | API for interfacing with R2 storage. Service-bound to `/backend/database`. | -| `backend/ai` | API for making requests to Workers AI . | +| `backend/ai` | API for making requests to Workers AI . | ### Development @@ -151,11 +261,15 @@ It should be in the form `category(scope or module): message` in your commit mes - `feat / feature`: all changes that introduce completely new code or new features + - `fix`: changes that fix a bug (ideally you will additionally reference an issue if present) + - `refactor`: any code related change that is not a fix nor a feature + - `docs`: changing existing or creating new documentation (i.e. 
README, docs for usage of a lib or cli usage) + - `chore`: all changes to the repository that do not fit into any of the above categories diff --git a/backend/ai/.prettierrc b/backend/ai/.prettierrc deleted file mode 100644 index 42830e2..0000000 --- a/backend/ai/.prettierrc +++ /dev/null @@ -1,5 +0,0 @@ -{ - "tabWidth": 2, - "semi": false, - "singleQuote": false -} \ No newline at end of file diff --git a/backend/ai/package.json b/backend/ai/package.json index 59dc0ae..c5c2216 100644 --- a/backend/ai/package.json +++ b/backend/ai/package.json @@ -1,22 +1,22 @@ { - "name": "ai", - "version": "0.0.0", - "private": true, - "scripts": { - "deploy": "wrangler deploy", - "dev": "wrangler dev", - "start": "wrangler dev", - "test": "vitest", - "cf-typegen": "wrangler types" - }, - "devDependencies": { - "@cloudflare/vitest-pool-workers": "^0.1.0", - "@cloudflare/workers-types": "^4.20240512.0", - "typescript": "^5.0.4", - "vitest": "1.3.0", - "wrangler": "^3.0.0" - }, - "dependencies": { - "@anthropic-ai/sdk": "^0.27.2" - } -} + "name": "ai", + "version": "0.0.0", + "private": true, + "scripts": { + "deploy": "wrangler deploy", + "dev": "wrangler dev", + "start": "wrangler dev", + "test": "vitest", + "cf-typegen": "wrangler types" + }, + "devDependencies": { + "@cloudflare/vitest-pool-workers": "^0.1.0", + "@cloudflare/workers-types": "^4.20240512.0", + "typescript": "^5.0.4", + "vitest": "1.3.0", + "wrangler": "^3.0.0" + }, + "dependencies": { + "@anthropic-ai/sdk": "^0.27.2" + } +} \ No newline at end of file diff --git a/backend/ai/src/index.ts b/backend/ai/src/index.ts index ff4635f..cd9bd4a 100644 --- a/backend/ai/src/index.ts +++ b/backend/ai/src/index.ts @@ -1,57 +1,61 @@ -import { Anthropic } from "@anthropic-ai/sdk"; -import { MessageParam } from "@anthropic-ai/sdk/src/resources/messages.js"; +import { Anthropic } from "@anthropic-ai/sdk" +import { MessageParam } from "@anthropic-ai/sdk/src/resources/messages.js" export interface Env { - ANTHROPIC_API_KEY: string; + ANTHROPIC_API_KEY: string } export default { - async fetch(request: Request, env: Env): Promise { - // Handle CORS preflight requests - if (request.method === "OPTIONS") { - return new Response(null, { - headers: { - "Access-Control-Allow-Origin": "*", - "Access-Control-Allow-Methods": "GET, POST, OPTIONS", - "Access-Control-Allow-Headers": "Content-Type", - }, - }); - } + async fetch(request: Request, env: Env): Promise { + // Handle CORS preflight requests + if (request.method === "OPTIONS") { + return new Response(null, { + headers: { + "Access-Control-Allow-Origin": "*", + "Access-Control-Allow-Methods": "GET, POST, OPTIONS", + "Access-Control-Allow-Headers": "Content-Type", + }, + }) + } - if (request.method !== "GET" && request.method !== "POST") { - return new Response("Method Not Allowed", { status: 405 }); - } + if (request.method !== "GET" && request.method !== "POST") { + return new Response("Method Not Allowed", { status: 405 }) + } - let body; - let isEditCodeWidget = false; - if (request.method === "POST") { - body = await request.json() as { messages: unknown; context: unknown; activeFileContent: string }; - } else { - const url = new URL(request.url); - const fileName = url.searchParams.get("fileName") || ""; - const code = url.searchParams.get("code") || ""; - const line = url.searchParams.get("line") || ""; - const instructions = url.searchParams.get("instructions") || ""; + let body + let isEditCodeWidget = false + if (request.method === "POST") { + body = (await request.json()) as { + messages: unknown + 
context: unknown + activeFileContent: string + } + } else { + const url = new URL(request.url) + const fileName = url.searchParams.get("fileName") || "" + const code = url.searchParams.get("code") || "" + const line = url.searchParams.get("line") || "" + const instructions = url.searchParams.get("instructions") || "" - body = { - messages: [{ role: "human", content: instructions }], - context: `File: ${fileName}\nLine: ${line}\nCode:\n${code}`, - activeFileContent: code, - }; - isEditCodeWidget = true; - } + body = { + messages: [{ role: "human", content: instructions }], + context: `File: ${fileName}\nLine: ${line}\nCode:\n${code}`, + activeFileContent: code, + } + isEditCodeWidget = true + } - const messages = body.messages; - const context = body.context; - const activeFileContent = body.activeFileContent; + const messages = body.messages + const context = body.context + const activeFileContent = body.activeFileContent - if (!Array.isArray(messages) || messages.length === 0) { - return new Response("Invalid or empty messages", { status: 400 }); - } + if (!Array.isArray(messages) || messages.length === 0) { + return new Response("Invalid or empty messages", { status: 400 }) + } - let systemMessage; - if (isEditCodeWidget) { - systemMessage = `You are an AI code editor. Your task is to modify the given code based on the user's instructions. Only output the modified code, without any explanations or markdown formatting. The code should be a direct replacement for the existing code. + let systemMessage + if (isEditCodeWidget) { + systemMessage = `You are an AI code editor. Your task is to modify the given code based on the user's instructions. Only output the modified code, without any explanations or markdown formatting. The code should be a direct replacement for the existing code. Context: ${context} @@ -61,9 +65,9 @@ ${activeFileContent} Instructions: ${messages[0].content} -Respond only with the modified code that can directly replace the existing code.`; - } else { - systemMessage = `You are an intelligent programming assistant. Please respond to the following request concisely. If your response includes code, please format it using triple backticks (\`\`\`) with the appropriate language identifier. For example: +Respond only with the modified code that can directly replace the existing code.` + } else { + systemMessage = `You are an intelligent programming assistant. Please respond to the following request concisely. If your response includes code, please format it using triple backticks (\`\`\`) with the appropriate language identifier. For example: \`\`\`python print("Hello, World!") @@ -71,51 +75,54 @@ print("Hello, World!") Provide a clear and concise explanation along with any code snippets. Keep your response brief and to the point. -${context ? `Context:\n${context}\n` : ''} -${activeFileContent ? `Active File Content:\n${activeFileContent}\n` : ''}`; - } +${context ? `Context:\n${context}\n` : ""} +${activeFileContent ? `Active File Content:\n${activeFileContent}\n` : ""}` + } - const anthropicMessages = messages.map(msg => ({ - role: msg.role === 'human' ? 'user' : 'assistant', - content: msg.content - })) as MessageParam[]; + const anthropicMessages = messages.map((msg) => ({ + role: msg.role === "human" ? 
"user" : "assistant", + content: msg.content, + })) as MessageParam[] - try { - const anthropic = new Anthropic({ apiKey: env.ANTHROPIC_API_KEY }); + try { + const anthropic = new Anthropic({ apiKey: env.ANTHROPIC_API_KEY }) - const stream = await anthropic.messages.create({ - model: "claude-3-5-sonnet-20240620", - max_tokens: 1024, - system: systemMessage, - messages: anthropicMessages, - stream: true, - }); + const stream = await anthropic.messages.create({ + model: "claude-3-5-sonnet-20240620", + max_tokens: 1024, + system: systemMessage, + messages: anthropicMessages, + stream: true, + }) - const encoder = new TextEncoder(); + const encoder = new TextEncoder() - const streamResponse = new ReadableStream({ - async start(controller) { - for await (const chunk of stream) { - if (chunk.type === 'content_block_delta' && chunk.delta.type === 'text_delta') { - const bytes = encoder.encode(chunk.delta.text); - controller.enqueue(bytes); - } - } - controller.close(); - }, - }); + const streamResponse = new ReadableStream({ + async start(controller) { + for await (const chunk of stream) { + if ( + chunk.type === "content_block_delta" && + chunk.delta.type === "text_delta" + ) { + const bytes = encoder.encode(chunk.delta.text) + controller.enqueue(bytes) + } + } + controller.close() + }, + }) - return new Response(streamResponse, { - headers: { - "Content-Type": "text/plain; charset=utf-8", - "Access-Control-Allow-Origin": "*", - "Cache-Control": "no-cache", - "Connection": "keep-alive", - }, - }); - } catch (error) { - console.error("Error:", error); - return new Response("Internal Server Error", { status: 500 }); - } - }, -}; + return new Response(streamResponse, { + headers: { + "Content-Type": "text/plain; charset=utf-8", + "Access-Control-Allow-Origin": "*", + "Cache-Control": "no-cache", + Connection: "keep-alive", + }, + }) + } catch (error) { + console.error("Error:", error) + return new Response("Internal Server Error", { status: 500 }) + } + }, +} diff --git a/backend/ai/test/index.spec.ts b/backend/ai/test/index.spec.ts index fbee335..706f17c 100644 --- a/backend/ai/test/index.spec.ts +++ b/backend/ai/test/index.spec.ts @@ -1,25 +1,30 @@ // test/index.spec.ts -import { env, createExecutionContext, waitOnExecutionContext, SELF } from 'cloudflare:test'; -import { describe, it, expect } from 'vitest'; -import worker from '../src/index'; +import { + createExecutionContext, + env, + SELF, + waitOnExecutionContext, +} from "cloudflare:test" +import { describe, expect, it } from "vitest" +import worker from "../src/index" // For now, you'll need to do something like this to get a correctly-typed // `Request` to pass to `worker.fetch()`. -const IncomingRequest = Request; +const IncomingRequest = Request -describe('Hello World worker', () => { - it('responds with Hello World! (unit style)', async () => { - const request = new IncomingRequest('http://example.com'); +describe("Hello World worker", () => { + it("responds with Hello World! (unit style)", async () => { + const request = new IncomingRequest("http://example.com") // Create an empty context to pass to `worker.fetch()`. 
- const ctx = createExecutionContext(); - const response = await worker.fetch(request, env, ctx); + const ctx = createExecutionContext() + const response = await worker.fetch(request, env, ctx) // Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions - await waitOnExecutionContext(ctx); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); + await waitOnExecutionContext(ctx) + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) - it('responds with Hello World! (integration style)', async () => { - const response = await SELF.fetch('https://example.com'); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); -}); + it("responds with Hello World! (integration style)", async () => { + const response = await SELF.fetch("https://example.com") + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) +}) diff --git a/backend/ai/test/tsconfig.json b/backend/ai/test/tsconfig.json index 509425f..339ee9b 100644 --- a/backend/ai/test/tsconfig.json +++ b/backend/ai/test/tsconfig.json @@ -1,11 +1,11 @@ { - "extends": "../tsconfig.json", - "compilerOptions": { - "types": [ - "@cloudflare/workers-types/experimental", - "@cloudflare/vitest-pool-workers" - ] - }, - "include": ["./**/*.ts", "../src/env.d.ts"], - "exclude": [] + "extends": "../tsconfig.json", + "compilerOptions": { + "types": [ + "@cloudflare/workers-types/experimental", + "@cloudflare/vitest-pool-workers" + ] + }, + "include": ["./**/*.ts", "../src/env.d.ts"], + "exclude": [] } diff --git a/backend/ai/tsconfig.json b/backend/ai/tsconfig.json index 9192490..8b55b9c 100644 --- a/backend/ai/tsconfig.json +++ b/backend/ai/tsconfig.json @@ -12,7 +12,9 @@ /* Language and Environment */ "target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */, - "lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, + "lib": [ + "es2021" + ] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, "jsx": "react" /* Specify what JSX code is generated. */, // "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */ // "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. 
*/ diff --git a/backend/ai/vitest.config.ts b/backend/ai/vitest.config.ts index 973627c..5643ba3 100644 --- a/backend/ai/vitest.config.ts +++ b/backend/ai/vitest.config.ts @@ -1,11 +1,11 @@ -import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config"; +import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config" export default defineWorkersConfig({ - test: { - poolOptions: { - workers: { - wrangler: { configPath: "./wrangler.toml" }, - }, - }, - }, -}); + test: { + poolOptions: { + workers: { + wrangler: { configPath: "./wrangler.toml" }, + }, + }, + }, +}) diff --git a/backend/ai/worker-configuration.d.ts b/backend/ai/worker-configuration.d.ts index 5b2319b..a3f43d2 100644 --- a/backend/ai/worker-configuration.d.ts +++ b/backend/ai/worker-configuration.d.ts @@ -1,4 +1,3 @@ // Generated by Wrangler // After adding bindings to `wrangler.toml`, regenerate this interface via `npm run cf-typegen` -interface Env { -} +interface Env {} diff --git a/backend/ai/wrangler.example.toml b/backend/ai/wrangler.example.toml index ede189b..8dcd480 100644 --- a/backend/ai/wrangler.example.toml +++ b/backend/ai/wrangler.example.toml @@ -5,3 +5,6 @@ compatibility_flags = ["nodejs_compat"] [ai] binding = "AI" + +[vars] +ANTHROPIC_API_KEY = "" diff --git a/backend/database/.prettierrc b/backend/database/.prettierrc deleted file mode 100644 index 42830e2..0000000 --- a/backend/database/.prettierrc +++ /dev/null @@ -1,5 +0,0 @@ -{ - "tabWidth": 2, - "semi": false, - "singleQuote": false -} \ No newline at end of file diff --git a/backend/database/drizzle.config.ts b/backend/database/drizzle.config.ts index 6b6291b..a551fed 100644 --- a/backend/database/drizzle.config.ts +++ b/backend/database/drizzle.config.ts @@ -1,4 +1,4 @@ -import type { Config } from "drizzle-kit"; +import type { Config } from "drizzle-kit" export default process.env.LOCAL_DB_PATH ? 
({ @@ -16,4 +16,4 @@ export default process.env.LOCAL_DB_PATH wranglerConfigPath: "wrangler.toml", dbName: "d1-sandbox", }, - } satisfies Config); + } satisfies Config) diff --git a/backend/database/package.json b/backend/database/package.json index 6aae34c..d1aad0f 100644 --- a/backend/database/package.json +++ b/backend/database/package.json @@ -1,32 +1,32 @@ { - "name": "database", - "version": "0.0.0", - "private": true, - "scripts": { - "deploy": "wrangler deploy", - "dev": "wrangler dev", - "start": "wrangler dev", - "test": "vitest", - "generate": "drizzle-kit generate:sqlite --schema=src/schema.ts", - "up": "drizzle-kit up:sqlite --schema=src/schema.ts", - "db:studio": "cross-env LOCAL_DB_PATH=$(find .wrangler/state/v3/d1/miniflare-D1DatabaseObject -type f -name '*.sqlite' -print -quit) drizzle-kit studio" - }, - "devDependencies": { - "@cloudflare/vitest-pool-workers": "^0.1.0", - "@cloudflare/workers-types": "^4.20240405.0", - "@types/itty-router-extras": "^0.4.3", - "drizzle-kit": "^0.20.17", - "typescript": "^5.0.4", - "vitest": "1.3.0", - "wrangler": "^3.0.0" - }, - "dependencies": { - "@paralleldrive/cuid2": "^2.2.2", - "better-sqlite3": "^9.5.0", - "cross-env": "^7.0.3", - "drizzle-orm": "^0.30.8", - "itty-router": "^5.0.16", - "itty-router-extras": "^0.4.6", - "zod": "^3.22.4" - } -} + "name": "database", + "version": "0.0.0", + "private": true, + "scripts": { + "deploy": "wrangler deploy", + "dev": "wrangler dev", + "start": "wrangler dev", + "test": "vitest", + "generate": "drizzle-kit generate:sqlite --schema=src/schema.ts", + "up": "drizzle-kit up:sqlite --schema=src/schema.ts", + "db:studio": "cross-env LOCAL_DB_PATH=$(find .wrangler/state/v3/d1/miniflare-D1DatabaseObject -type f -name '*.sqlite' -print -quit) drizzle-kit studio" + }, + "devDependencies": { + "@cloudflare/vitest-pool-workers": "^0.1.0", + "@cloudflare/workers-types": "^4.20240405.0", + "@types/itty-router-extras": "^0.4.3", + "drizzle-kit": "^0.20.17", + "typescript": "^5.0.4", + "vitest": "1.3.0", + "wrangler": "^3.0.0" + }, + "dependencies": { + "@paralleldrive/cuid2": "^2.2.2", + "better-sqlite3": "^9.5.0", + "cross-env": "^7.0.3", + "drizzle-orm": "^0.30.8", + "itty-router": "^5.0.16", + "itty-router-extras": "^0.4.6", + "zod": "^3.22.4" + } +} \ No newline at end of file diff --git a/backend/database/src/index.ts b/backend/database/src/index.ts index f4eec2a..d0069e2 100644 --- a/backend/database/src/index.ts +++ b/backend/database/src/index.ts @@ -1,11 +1,11 @@ // import type { DrizzleD1Database } from "drizzle-orm/d1"; import { drizzle } from "drizzle-orm/d1" import { json } from "itty-router-extras" -import { ZodError, z } from "zod" +import { z } from "zod" -import { user, sandbox, usersToSandboxes } from "./schema" -import * as schema from "./schema" import { and, eq, sql } from "drizzle-orm" +import * as schema from "./schema" +import { sandbox, user, usersToSandboxes } from "./schema" export interface Env { DB: D1Database diff --git a/backend/database/src/schema.ts b/backend/database/src/schema.ts index 0d088dd..5b974a7 100644 --- a/backend/database/src/schema.ts +++ b/backend/database/src/schema.ts @@ -1,6 +1,6 @@ -import { integer, sqliteTable, text } from "drizzle-orm/sqlite-core"; -import { createId } from "@paralleldrive/cuid2"; -import { relations, sql } from "drizzle-orm"; +import { createId } from "@paralleldrive/cuid2" +import { relations } from "drizzle-orm" +import { integer, sqliteTable, text } from "drizzle-orm/sqlite-core" export const user = sqliteTable("user", { id: 
text("id") @@ -11,14 +11,14 @@ export const user = sqliteTable("user", { email: text("email").notNull(), image: text("image"), generations: integer("generations").default(0), -}); +}) -export type User = typeof user.$inferSelect; +export type User = typeof user.$inferSelect export const userRelations = relations(user, ({ many }) => ({ sandbox: many(sandbox), usersToSandboxes: many(usersToSandboxes), -})); +})) export const sandbox = sqliteTable("sandbox", { id: text("id") @@ -32,9 +32,9 @@ export const sandbox = sqliteTable("sandbox", { userId: text("user_id") .notNull() .references(() => user.id), -}); +}) -export type Sandbox = typeof sandbox.$inferSelect; +export type Sandbox = typeof sandbox.$inferSelect export const sandboxRelations = relations(sandbox, ({ one, many }) => ({ author: one(user, { @@ -42,7 +42,7 @@ export const sandboxRelations = relations(sandbox, ({ one, many }) => ({ references: [user.id], }), usersToSandboxes: many(usersToSandboxes), -})); +})) export const usersToSandboxes = sqliteTable("users_to_sandboxes", { userId: text("userId") @@ -52,15 +52,18 @@ export const usersToSandboxes = sqliteTable("users_to_sandboxes", { .notNull() .references(() => sandbox.id), sharedOn: integer("sharedOn", { mode: "timestamp_ms" }), -}); +}) -export const usersToSandboxesRelations = relations(usersToSandboxes, ({ one }) => ({ - group: one(sandbox, { - fields: [usersToSandboxes.sandboxId], - references: [sandbox.id], - }), - user: one(user, { - fields: [usersToSandboxes.userId], - references: [user.id], - }), -})); +export const usersToSandboxesRelations = relations( + usersToSandboxes, + ({ one }) => ({ + group: one(sandbox, { + fields: [usersToSandboxes.sandboxId], + references: [sandbox.id], + }), + user: one(user, { + fields: [usersToSandboxes.userId], + references: [user.id], + }), + }) +) diff --git a/backend/database/test/index.spec.ts b/backend/database/test/index.spec.ts index 7522d5b..706f17c 100644 --- a/backend/database/test/index.spec.ts +++ b/backend/database/test/index.spec.ts @@ -1,25 +1,30 @@ // test/index.spec.ts -import { env, createExecutionContext, waitOnExecutionContext, SELF } from "cloudflare:test"; -import { describe, it, expect } from "vitest"; -import worker from "../src/index"; +import { + createExecutionContext, + env, + SELF, + waitOnExecutionContext, +} from "cloudflare:test" +import { describe, expect, it } from "vitest" +import worker from "../src/index" // For now, you'll need to do something like this to get a correctly-typed // `Request` to pass to `worker.fetch()`. -const IncomingRequest = Request; +const IncomingRequest = Request describe("Hello World worker", () => { - it("responds with Hello World! (unit style)", async () => { - const request = new IncomingRequest("http://example.com"); - // Create an empty context to pass to `worker.fetch()`. - const ctx = createExecutionContext(); - const response = await worker.fetch(request, env, ctx); - // Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions - await waitOnExecutionContext(ctx); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); + it("responds with Hello World! (unit style)", async () => { + const request = new IncomingRequest("http://example.com") + // Create an empty context to pass to `worker.fetch()`. 
+ const ctx = createExecutionContext() + const response = await worker.fetch(request, env, ctx) + // Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions + await waitOnExecutionContext(ctx) + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) - it("responds with Hello World! (integration style)", async () => { - const response = await SELF.fetch("https://example.com"); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); -}); + it("responds with Hello World! (integration style)", async () => { + const response = await SELF.fetch("https://example.com") + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) +}) diff --git a/backend/database/test/tsconfig.json b/backend/database/test/tsconfig.json index 509425f..339ee9b 100644 --- a/backend/database/test/tsconfig.json +++ b/backend/database/test/tsconfig.json @@ -1,11 +1,11 @@ { - "extends": "../tsconfig.json", - "compilerOptions": { - "types": [ - "@cloudflare/workers-types/experimental", - "@cloudflare/vitest-pool-workers" - ] - }, - "include": ["./**/*.ts", "../src/env.d.ts"], - "exclude": [] + "extends": "../tsconfig.json", + "compilerOptions": { + "types": [ + "@cloudflare/workers-types/experimental", + "@cloudflare/vitest-pool-workers" + ] + }, + "include": ["./**/*.ts", "../src/env.d.ts"], + "exclude": [] } diff --git a/backend/database/tsconfig.json b/backend/database/tsconfig.json index 9192490..8b55b9c 100644 --- a/backend/database/tsconfig.json +++ b/backend/database/tsconfig.json @@ -12,7 +12,9 @@ /* Language and Environment */ "target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */, - "lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, + "lib": [ + "es2021" + ] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, "jsx": "react" /* Specify what JSX code is generated. */, // "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */ // "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. 
*/ diff --git a/backend/database/vitest.config.ts b/backend/database/vitest.config.ts index 973627c..5643ba3 100644 --- a/backend/database/vitest.config.ts +++ b/backend/database/vitest.config.ts @@ -1,11 +1,11 @@ -import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config"; +import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config" export default defineWorkersConfig({ - test: { - poolOptions: { - workers: { - wrangler: { configPath: "./wrangler.toml" }, - }, - }, - }, -}); + test: { + poolOptions: { + workers: { + wrangler: { configPath: "./wrangler.toml" }, + }, + }, + }, +}) diff --git a/backend/server/src/ConnectionManager.ts b/backend/server/src/ConnectionManager.ts new file mode 100644 index 0000000..45b5432 --- /dev/null +++ b/backend/server/src/ConnectionManager.ts @@ -0,0 +1,58 @@ +import { Socket } from "socket.io" + +class Counter { + private count: number = 0 + + increment() { + this.count++ + } + + decrement() { + this.count = Math.max(0, this.count - 1) + } + + getValue(): number { + return this.count + } +} + +// Owner Connection Management +export class ConnectionManager { + // Counts how many times the owner is connected to a sandbox + private ownerConnections: Record = {} + // Stores all sockets connected to a given sandbox + private sockets: Record> = {} + + // Checks if the owner of a sandbox is connected + ownerIsConnected(sandboxId: string): boolean { + return this.ownerConnections[sandboxId]?.getValue() > 0 + } + + // Adds a connection for a sandbox + addConnectionForSandbox(socket: Socket, sandboxId: string, isOwner: boolean) { + this.sockets[sandboxId] ??= new Set() + this.sockets[sandboxId].add(socket) + + // If the connection is for the owner, increments the owner connection counter + if (isOwner) { + this.ownerConnections[sandboxId] ??= new Counter() + this.ownerConnections[sandboxId].increment() + } + } + + // Removes a connection for a sandbox + removeConnectionForSandbox(socket: Socket, sandboxId: string, isOwner: boolean) { + this.sockets[sandboxId]?.delete(socket) + + // If the connection being removed is for the owner, decrements the owner connection counter + if (isOwner) { + this.ownerConnections[sandboxId]?.decrement() + } + } + + // Returns the set of sockets connected to a given sandbox + connectionsForSandbox(sandboxId: string): Set { + return this.sockets[sandboxId] ?? 
new Set(); + } + +} \ No newline at end of file diff --git a/backend/server/src/FileManager.ts b/backend/server/src/FileManager.ts index 278d060..88d53f3 100644 --- a/backend/server/src/FileManager.ts +++ b/backend/server/src/FileManager.ts @@ -4,12 +4,6 @@ import RemoteFileStorage from "./RemoteFileStorage" import { MAX_BODY_SIZE } from "./ratelimit" import { TFile, TFileData, TFolder } from "./types" -// Define the structure for sandbox files -export type SandboxFiles = { - files: (TFolder | TFile)[] - fileData: TFileData[] -} - // Convert list of paths to the hierchical file structure used by the editor function generateFileStructure(paths: string[]): (TFolder | TFile)[] { const root: TFolder = { id: "/", type: "folder", name: "/", children: [] } @@ -52,20 +46,22 @@ function generateFileStructure(paths: string[]): (TFolder | TFile)[] { export class FileManager { private sandboxId: string private sandbox: Sandbox - public sandboxFiles: SandboxFiles + public files: (TFolder | TFile)[] + public fileData: TFileData[] private fileWatchers: WatchHandle[] = [] private dirName = "/home/user/project" - private refreshFileList: (files: SandboxFiles) => void + private refreshFileList: ((files: (TFolder | TFile)[]) => void) | null // Constructor to initialize the FileManager constructor( sandboxId: string, sandbox: Sandbox, - refreshFileList: (files: SandboxFiles) => void + refreshFileList: ((files: (TFolder | TFile)[]) => void) | null ) { this.sandboxId = sandboxId this.sandbox = sandbox - this.sandboxFiles = { files: [], fileData: [] } + this.files = [] + this.fileData = [] this.refreshFileList = refreshFileList } @@ -110,16 +106,16 @@ export class FileManager { private async updateFileData(): Promise { const remotePaths = await RemoteFileStorage.getSandboxPaths(this.sandboxId) const localPaths = this.getLocalFileIds(remotePaths) - this.sandboxFiles.fileData = await this.generateFileData(localPaths) - return this.sandboxFiles.fileData + this.fileData = await this.generateFileData(localPaths) + return this.fileData } // Update file structure private async updateFileStructure(): Promise<(TFolder | TFile)[]> { const remotePaths = await RemoteFileStorage.getSandboxPaths(this.sandboxId) const localPaths = this.getLocalFileIds(remotePaths) - this.sandboxFiles.files = generateFileStructure(localPaths) - return this.sandboxFiles.files + this.files = generateFileStructure(localPaths) + return this.files } // Initialize the FileManager @@ -130,9 +126,9 @@ export class FileManager { await this.updateFileData() // Copy all files from the project to the container - const promises = this.sandboxFiles.fileData.map(async (file) => { + const promises = this.fileData.map(async (file) => { try { - const filePath = path.join(this.dirName, file.id) + const filePath = path.posix.join(this.dirName, file.id) const parentDirectory = path.dirname(filePath) if (!this.sandbox.files.exists(parentDirectory)) { await this.sandbox.files.makeDir(parentDirectory) @@ -209,7 +205,7 @@ export class FileManager { // Handle file/directory creation event if (event.type === "create") { const folder = findFolderById( - this.sandboxFiles.files, + this.files, sandboxDirectory ) as TFolder const isDir = await this.isDirectory(containerFilePath) @@ -232,7 +228,7 @@ export class FileManager { folder.children.push(newItem) } else { // If folder doesn't exist, add the new item to the root - this.sandboxFiles.files.push(newItem) + this.files.push(newItem) } if (!isDir) { @@ -241,7 +237,7 @@ export class FileManager { ) const fileContents = 
typeof fileData === "string" ? fileData : "" - this.sandboxFiles.fileData.push({ + this.fileData.push({ id: sandboxFilePath, data: fileContents, }) @@ -253,7 +249,7 @@ export class FileManager { // Handle file/directory removal or rename event else if (event.type === "remove" || event.type == "rename") { const folder = findFolderById( - this.sandboxFiles.files, + this.files, sandboxDirectory ) as TFolder const isDir = await this.isDirectory(containerFilePath) @@ -269,13 +265,13 @@ export class FileManager { ) } else { // Remove from the root if it's not inside a folder - this.sandboxFiles.files = this.sandboxFiles.files.filter( + this.files = this.files.filter( (file: TFolder | TFile) => !isFileMatch(file) ) } // Also remove any corresponding file data - this.sandboxFiles.fileData = this.sandboxFiles.fileData.filter( + this.fileData = this.fileData.filter( (file: TFileData) => !isFileMatch(file) ) @@ -285,10 +281,10 @@ export class FileManager { // Handle file write event else if (event.type === "write") { const folder = findFolderById( - this.sandboxFiles.files, + this.files, sandboxDirectory ) as TFolder - const fileToWrite = this.sandboxFiles.fileData.find( + const fileToWrite = this.fileData.find( (file) => file.id === sandboxFilePath ) @@ -308,7 +304,7 @@ export class FileManager { ) const fileContents = typeof fileData === "string" ? fileData : "" - this.sandboxFiles.fileData.push({ + this.fileData.push({ id: sandboxFilePath, data: fileContents, }) @@ -318,7 +314,9 @@ export class FileManager { } // Tell the client to reload the file list - this.refreshFileList(this.sandboxFiles) + if (event.type !== "chmod") { + this.refreshFileList?.(this.files) + } } catch (error) { console.error( `Error handling ${event.type} event for ${event.name}:`, @@ -350,7 +348,7 @@ export class FileManager { // Get file content async getFile(fileId: string): Promise { - const file = this.sandboxFiles.fileData.find((f) => f.id === fileId) + const file = this.fileData.find((f) => f.id === fileId) return file?.data } @@ -368,7 +366,7 @@ export class FileManager { throw new Error("File size too large. 
Please reduce the file size.") } await RemoteFileStorage.saveFile(this.getRemoteFileId(fileId), body) - const file = this.sandboxFiles.fileData.find((f) => f.id === fileId) + const file = this.fileData.find((f) => f.id === fileId) if (!file) return file.data = body @@ -381,9 +379,9 @@ export class FileManager { fileId: string, folderId: string ): Promise<(TFolder | TFile)[]> { - const fileData = this.sandboxFiles.fileData.find((f) => f.id === fileId) - const file = this.sandboxFiles.files.find((f) => f.id === fileId) - if (!fileData || !file) return this.sandboxFiles.files + const fileData = this.fileData.find((f) => f.id === fileId) + const file = this.files.find((f) => f.id === fileId) + if (!fileData || !file) return this.files const parts = fileId.split("/") const newFileId = folderId + "/" + parts.pop() @@ -427,13 +425,13 @@ export class FileManager { await this.sandbox.files.write(path.posix.join(this.dirName, id), "") await this.fixPermissions() - this.sandboxFiles.files.push({ + this.files.push({ id, name, type: "file", }) - this.sandboxFiles.fileData.push({ + this.fileData.push({ id, data: "", }) @@ -451,8 +449,8 @@ export class FileManager { // Rename a file async renameFile(fileId: string, newName: string): Promise { - const fileData = this.sandboxFiles.fileData.find((f) => f.id === fileId) - const file = this.sandboxFiles.files.find((f) => f.id === fileId) + const fileData = this.fileData.find((f) => f.id === fileId) + const file = this.files.find((f) => f.id === fileId) if (!fileData || !file) return const parts = fileId.split("/") @@ -468,11 +466,11 @@ export class FileManager { // Delete a file async deleteFile(fileId: string): Promise<(TFolder | TFile)[]> { - const file = this.sandboxFiles.fileData.find((f) => f.id === fileId) - if (!file) return this.sandboxFiles.files + const file = this.fileData.find((f) => f.id === fileId) + if (!file) return this.files await this.sandbox.files.remove(path.posix.join(this.dirName, fileId)) - this.sandboxFiles.fileData = this.sandboxFiles.fileData.filter( + this.fileData = this.fileData.filter( (f) => f.id !== fileId ) @@ -487,7 +485,7 @@ export class FileManager { await Promise.all( files.map(async (file) => { await this.sandbox.files.remove(path.posix.join(this.dirName, file)) - this.sandboxFiles.fileData = this.sandboxFiles.fileData.filter( + this.fileData = this.fileData.filter( (f) => f.id !== file ) await RemoteFileStorage.deleteFile(this.getRemoteFileId(file)) diff --git a/backend/server/src/Sandbox.ts b/backend/server/src/Sandbox.ts new file mode 100644 index 0000000..fda2237 --- /dev/null +++ b/backend/server/src/Sandbox.ts @@ -0,0 +1,243 @@ +import { Sandbox as E2BSandbox } from "e2b" +import { Socket } from "socket.io" +import { AIWorker } from "./AIWorker" +import { CONTAINER_TIMEOUT } from "./constants" +import { DokkuClient } from "./DokkuClient" +import { FileManager } from "./FileManager" +import { + createFileRL, + createFolderRL, + deleteFileRL, + renameFileRL, + saveFileRL, +} from "./ratelimit" +import { SecureGitClient } from "./SecureGitClient" +import { TerminalManager } from "./TerminalManager" +import { TFile, TFolder } from "./types" +import { LockManager } from "./utils" + +const lockManager = new LockManager() + +// Define a type for SocketHandler functions +type SocketHandler> = (args: T) => any; + +// Extract port number from a string +function extractPortNumber(inputString: string): number | null { + const cleanedString = inputString.replace(/\x1B\[[0-9;]*m/g, "") + const regex = /http:\/\/localhost:(\d+)/ 
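+  // After the ANSI color codes are stripped above, this regex captures the port from
+  // dev-server output containing a URL like "http://localhost:3000", so the terminal
+  // handler can emit a preview URL for that port.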
+ const match = cleanedString.match(regex) + return match ? parseInt(match[1]) : null +} + +type ServerContext = { + aiWorker: AIWorker; + dokkuClient: DokkuClient | null; + gitClient: SecureGitClient | null; +}; + +export class Sandbox { + // Sandbox properties: + sandboxId: string; + fileManager: FileManager | null; + terminalManager: TerminalManager | null; + container: E2BSandbox | null; + // Server context: + dokkuClient: DokkuClient | null; + gitClient: SecureGitClient | null; + aiWorker: AIWorker; + + constructor(sandboxId: string, { aiWorker, dokkuClient, gitClient }: ServerContext) { + // Sandbox properties: + this.sandboxId = sandboxId; + this.fileManager = null; + this.terminalManager = null; + this.container = null; + // Server context: + this.aiWorker = aiWorker; + this.dokkuClient = dokkuClient; + this.gitClient = gitClient; + } + + // Initializes the container for the sandbox environment + async initialize( + fileWatchCallback: ((files: (TFolder | TFile)[]) => void) | undefined + ) { + // Acquire a lock to ensure exclusive access to the sandbox environment + await lockManager.acquireLock(this.sandboxId, async () => { + // Check if a container already exists and is running + if (this.container && await this.container.isRunning()) { + console.log(`Found existing container ${this.sandboxId}`) + } else { + console.log("Creating container", this.sandboxId) + // Create a new container with a specified timeout + this.container = await E2BSandbox.create({ + timeoutMs: CONTAINER_TIMEOUT, + }) + } + }) + // Ensure a container was successfully created + if (!this.container) throw new Error("Failed to create container") + + // Initialize the terminal manager if it hasn't been set up yet + if (!this.terminalManager) { + this.terminalManager = new TerminalManager(this.container) + console.log(`Terminal manager set up for ${this.sandboxId}`) + } + + // Initialize the file manager if it hasn't been set up yet + if (!this.fileManager) { + this.fileManager = new FileManager( + this.sandboxId, + this.container, + fileWatchCallback ?? 
null + ) + // Initialize the file manager and emit the initial files + await this.fileManager.initialize() + } + } + + // Called when the client disconnects from the Sandbox + async disconnect() { + // Close all terminals managed by the terminal manager + await this.terminalManager?.closeAllTerminals() + // This way the terminal manager will be set up again if we reconnect + this.terminalManager = null; + // Close all file watchers managed by the file manager + await this.fileManager?.closeWatchers() + // This way the file manager will be set up again if we reconnect + this.fileManager = null; + } + + handlers(connection: { userId: string, isOwner: boolean, socket: Socket }) { + + // Handle heartbeat from a socket connection + const handleHeartbeat: SocketHandler = (_: any) => { + // Only keep the sandbox alive if the owner is still connected + if (connection.isOwner) { + this.container?.setTimeout(CONTAINER_TIMEOUT) + } + } + + // Handle getting a file + const handleGetFile: SocketHandler = ({ fileId }: any) => { + return this.fileManager?.getFile(fileId) + } + + // Handle getting a folder + const handleGetFolder: SocketHandler = ({ folderId }: any) => { + return this.fileManager?.getFolder(folderId) + } + + // Handle saving a file + const handleSaveFile: SocketHandler = async ({ fileId, body }: any) => { + await saveFileRL.consume(connection.userId, 1); + return this.fileManager?.saveFile(fileId, body) + } + + // Handle moving a file + const handleMoveFile: SocketHandler = ({ fileId, folderId }: any) => { + return this.fileManager?.moveFile(fileId, folderId) + } + + // Handle listing apps + const handleListApps: SocketHandler = async (_: any) => { + if (!this.dokkuClient) throw Error("Failed to retrieve apps list: No Dokku client") + return { success: true, apps: await this.dokkuClient.listApps() } + } + + // Handle deploying code + const handleDeploy: SocketHandler = async (_: any) => { + if (!this.gitClient) throw Error("No git client") + if (!this.fileManager) throw Error("No file manager") + await this.gitClient.pushFiles(this.fileManager?.fileData, this.sandboxId) + return { success: true } + } + + // Handle creating a file + const handleCreateFile: SocketHandler = async ({ name }: any) => { + await createFileRL.consume(connection.userId, 1); + return { "success": await this.fileManager?.createFile(name) } + } + + // Handle creating a folder + const handleCreateFolder: SocketHandler = async ({ name }: any) => { + await createFolderRL.consume(connection.userId, 1); + return { "success": await this.fileManager?.createFolder(name) } + } + + // Handle renaming a file + const handleRenameFile: SocketHandler = async ({ fileId, newName }: any) => { + await renameFileRL.consume(connection.userId, 1) + return this.fileManager?.renameFile(fileId, newName) + } + + // Handle deleting a file + const handleDeleteFile: SocketHandler = async ({ fileId }: any) => { + await deleteFileRL.consume(connection.userId, 1) + return this.fileManager?.deleteFile(fileId) + } + + // Handle deleting a folder + const handleDeleteFolder: SocketHandler = ({ folderId }: any) => { + return this.fileManager?.deleteFolder(folderId) + } + + // Handle creating a terminal session + const handleCreateTerminal: SocketHandler = async ({ id }: any) => { + await lockManager.acquireLock(this.sandboxId, async () => { + await this.terminalManager?.createTerminal(id, (responseString: string) => { + connection.socket.emit("terminalResponse", { id, data: responseString }) + const port = extractPortNumber(responseString) + if (port) 
{ + connection.socket.emit( + "previewURL", + "https://" + this.container?.getHost(port) + ) + } + }) + }) + } + + // Handle resizing a terminal + const handleResizeTerminal: SocketHandler = ({ dimensions }: any) => { + this.terminalManager?.resizeTerminal(dimensions) + } + + // Handle sending data to a terminal + const handleTerminalData: SocketHandler = ({ id, data }: any) => { + return this.terminalManager?.sendTerminalData(id, data) + } + + // Handle closing a terminal + const handleCloseTerminal: SocketHandler = ({ id }: any) => { + return this.terminalManager?.closeTerminal(id) + } + + // Handle generating code + const handleGenerateCode: SocketHandler = ({ fileName, code, line, instructions }: any) => { + return this.aiWorker.generateCode(connection.userId, fileName, code, line, instructions) + } + + return { + "heartbeat": handleHeartbeat, + "getFile": handleGetFile, + "getFolder": handleGetFolder, + "saveFile": handleSaveFile, + "moveFile": handleMoveFile, + "list": handleListApps, + "deploy": handleDeploy, + "createFile": handleCreateFile, + "createFolder": handleCreateFolder, + "renameFile": handleRenameFile, + "deleteFile": handleDeleteFile, + "deleteFolder": handleDeleteFolder, + "createTerminal": handleCreateTerminal, + "resizeTerminal": handleResizeTerminal, + "terminalData": handleTerminalData, + "closeTerminal": handleCloseTerminal, + "generateCode": handleGenerateCode, + }; + + } + +} \ No newline at end of file diff --git a/backend/server/src/constants.ts b/backend/server/src/constants.ts new file mode 100644 index 0000000..dfd5ce3 --- /dev/null +++ b/backend/server/src/constants.ts @@ -0,0 +1,2 @@ +// The amount of time in ms that a container will stay alive without a hearbeat. +export const CONTAINER_TIMEOUT = 120_000 \ No newline at end of file diff --git a/backend/server/src/index.ts b/backend/server/src/index.ts index f69a303..cf95824 100644 --- a/backend/server/src/index.ts +++ b/backend/server/src/index.ts @@ -1,42 +1,39 @@ import cors from "cors" import dotenv from "dotenv" -import { Sandbox } from "e2b" import express, { Express } from "express" import fs from "fs" import { createServer } from "http" -import { Server } from "socket.io" -import { z } from "zod" +import { Server, Socket } from "socket.io" import { AIWorker } from "./AIWorker" + +import { ConnectionManager } from "./ConnectionManager" import { DokkuClient } from "./DokkuClient" -import { FileManager, SandboxFiles } from "./FileManager" -import { - createFileRL, - createFolderRL, - deleteFileRL, - renameFileRL, - saveFileRL, -} from "./ratelimit" +import { Sandbox } from "./Sandbox" import { SecureGitClient } from "./SecureGitClient" -import { TerminalManager } from "./TerminalManager" -import { User } from "./types" -import { LockManager } from "./utils" +import { socketAuth } from "./socketAuth"; // Import the new socketAuth middleware +import { TFile, TFolder } from "./types" + +// Log errors and send a notification to the client +export const handleErrors = (message: string, error: any, socket: Socket) => { + console.error(message, error); + socket.emit("error", `${message} ${error.message ?? 
error}`); +}; // Handle uncaught exceptions process.on("uncaughtException", (error) => { console.error("Uncaught Exception:", error) // Do not exit the process - // You can add additional logging or recovery logic here }) // Handle unhandled promise rejections process.on("unhandledRejection", (reason, promise) => { console.error("Unhandled Rejection at:", promise, "reason:", reason) // Do not exit the process - // You can also handle the rejected promise here if needed }) -// The amount of time in ms that a container will stay alive without a hearbeat. -const CONTAINER_TIMEOUT = 120_000 +// Initialize containers and managers +const connections = new ConnectionManager() +const sandboxes: Record = {} // Load environment variables dotenv.config() @@ -48,118 +45,39 @@ app.use(cors()) const httpServer = createServer(app) const io = new Server(httpServer, { cors: { - origin: "*", + origin: "*", // Allow connections from any origin }, }) -// Check if the sandbox owner is connected -function isOwnerConnected(sandboxId: string): boolean { - return (connections[sandboxId] ?? 0) > 0 -} - -// Extract port number from a string -function extractPortNumber(inputString: string): number | null { - const cleanedString = inputString.replace(/\x1B\[[0-9;]*m/g, "") - const regex = /http:\/\/localhost:(\d+)/ - const match = cleanedString.match(regex) - return match ? parseInt(match[1]) : null -} - -// Initialize containers and managers -const containers: Record = {} -const connections: Record = {} -const fileManagers: Record = {} -const terminalManagers: Record = {} - // Middleware for socket authentication -io.use(async (socket, next) => { - // Define the schema for handshake query validation - const handshakeSchema = z.object({ - userId: z.string(), - sandboxId: z.string(), - EIO: z.string(), - transport: z.string(), - }) - - const q = socket.handshake.query - const parseQuery = handshakeSchema.safeParse(q) - - // Check if the query is valid according to the schema - if (!parseQuery.success) { - next(new Error("Invalid request.")) - return - } - - const { sandboxId, userId } = parseQuery.data - // Fetch user data from the database - const dbUser = await fetch( - `${process.env.DATABASE_WORKER_URL}/api/user?id=${userId}`, - { - headers: { - Authorization: `${process.env.WORKERS_KEY}`, - }, - } - ) - const dbUserJSON = (await dbUser.json()) as User - - // Check if user data was retrieved successfully - if (!dbUserJSON) { - next(new Error("DB error.")) - return - } - - // Check if the user owns the sandbox or has shared access - const sandbox = dbUserJSON.sandbox.find((s) => s.id === sandboxId) - const sharedSandboxes = dbUserJSON.usersToSandboxes.find( - (uts) => uts.sandboxId === sandboxId - ) - - // If user doesn't own or have shared access to the sandbox, deny access - if (!sandbox && !sharedSandboxes) { - next(new Error("Invalid credentials.")) - return - } - - // Set socket data with user information - socket.data = { - userId, - sandboxId: sandboxId, - isOwner: sandbox !== undefined, - } - - // Allow the connection - next() -}) - -// Initialize lock manager -const lockManager = new LockManager() +io.use(socketAuth) // Use the new socketAuth middleware // Check for required environment variables if (!process.env.DOKKU_HOST) - console.error("Environment variable DOKKU_HOST is not defined") + console.warn("Environment variable DOKKU_HOST is not defined") if (!process.env.DOKKU_USERNAME) - console.error("Environment variable DOKKU_USERNAME is not defined") + console.warn("Environment variable DOKKU_USERNAME 
is not defined") if (!process.env.DOKKU_KEY) - console.error("Environment variable DOKKU_KEY is not defined") + console.warn("Environment variable DOKKU_KEY is not defined") // Initialize Dokku client -const client = +const dokkuClient = process.env.DOKKU_HOST && process.env.DOKKU_KEY && process.env.DOKKU_USERNAME ? new DokkuClient({ - host: process.env.DOKKU_HOST, - username: process.env.DOKKU_USERNAME, - privateKey: fs.readFileSync(process.env.DOKKU_KEY), - }) + host: process.env.DOKKU_HOST, + username: process.env.DOKKU_USERNAME, + privateKey: fs.readFileSync(process.env.DOKKU_KEY), + }) : null -client?.connect() +dokkuClient?.connect() // Initialize Git client used to deploy Dokku apps -const git = +const gitClient = process.env.DOKKU_HOST && process.env.DOKKU_KEY ? new SecureGitClient( - `dokku@${process.env.DOKKU_HOST}`, - process.env.DOKKU_KEY - ) + `dokku@${process.env.DOKKU_HOST}`, + process.env.DOKKU_KEY + ) : null // Add this near the top of the file, after other initializations @@ -170,357 +88,95 @@ const aiWorker = new AIWorker( process.env.WORKERS_KEY! ) -// Handle socket connections +// Handle a client connecting to the server io.on("connection", async (socket) => { try { + // This data comes is added by our authentication middleware const data = socket.data as { userId: string sandboxId: string isOwner: boolean } - // Handle connection based on user type (owner or not) - if (data.isOwner) { - connections[data.sandboxId] = (connections[data.sandboxId] ?? 0) + 1 - } else { - if (!isOwnerConnected(data.sandboxId)) { - socket.emit("disableAccess", "The sandbox owner is not connected.") - return - } + // Register the connection + connections.addConnectionForSandbox(socket, data.sandboxId, data.isOwner) + + // Disable access unless the sandbox owner is connected + if (!data.isOwner && !connections.ownerIsConnected(data.sandboxId)) { + socket.emit("disableAccess", "The sandbox owner is not connected.") + return } - // Create or retrieve container - const createdContainer = await lockManager.acquireLock( - data.sandboxId, - async () => { + try { + // Create or retrieve the sandbox manager for the given sandbox ID + const sandbox = sandboxes[data.sandboxId] ?? new Sandbox( + data.sandboxId, + { + aiWorker, dokkuClient, gitClient, + } + ) + sandboxes[data.sandboxId] = sandbox + + // This callback recieves an update when the file list changes, and notifies all relevant connections. 
+ + // This callback receives an update when the file list changes, and notifies all relevant connections. + const sendFileNotifications = (files: (TFolder | TFile)[]) => { + connections.connectionsForSandbox(data.sandboxId).forEach((socket: Socket) => { + socket.emit("loaded", files); + }); + }; + + // Initialize the sandbox container + // The file manager and terminal managers will be set up if they have been closed + await sandbox.initialize(sendFileNotifications) + socket.emit("loaded", sandbox.fileManager?.files) + + // Register event handlers for the sandbox + // For each event handler, listen on the socket for that event + // Pass connection-specific information to the handlers + Object.entries(sandbox.handlers({ + userId: data.userId, + isOwner: data.isOwner, + socket + })).forEach(([event, handler]) => { + socket.on(event, async (options: any, callback?: (response: any) => void) => { + try { + const result = await handler(options) + callback?.(result); + } catch (e: any) { + handleErrors(`Error processing event "${event}":`, e, socket); + } + }); + }); + + // Handle disconnection event + socket.on("disconnect", async () => { try { - // Start a new container if the container doesn't exist or it timed out. - if ( - !containers[data.sandboxId] || - !(await containers[data.sandboxId].isRunning()) - ) { - containers[data.sandboxId] = await Sandbox.create({ - timeoutMs: CONTAINER_TIMEOUT, - }) - console.log("Created container ", data.sandboxId) - return true + // Deregister the connection + connections.removeConnectionForSandbox(socket, data.sandboxId, data.isOwner) + + // If the owner has disconnected from all sockets, close open terminals and file watchers. + // The sandbox itself will time out after the heartbeat stops. + if (data.isOwner && !connections.ownerIsConnected(data.sandboxId)) { + await sandbox.disconnect() + socket.broadcast.emit( + "disableAccess", + "The sandbox owner has disconnected." + ) } } catch (e: any) { - console.error(`Error creating container ${data.sandboxId}:`, e) - io.emit("error", `Error: container creation. ${e.message ?? e}`) + handleErrors("Error disconnecting:", e, socket); } - } - ) + }) - // Function to send loaded event - const sendLoadedEvent = (files: SandboxFiles) => { - socket.emit("loaded", files.files) + } catch (e: any) { + handleErrors(`Error initializing sandbox ${data.sandboxId}:`, e, socket); } - // Initialize file and terminal managers if container was created - if (createdContainer) { - fileManagers[data.sandboxId] = new FileManager( - data.sandboxId, - containers[data.sandboxId], - sendLoadedEvent - ) - await fileManagers[data.sandboxId].initialize() - terminalManagers[data.sandboxId] = new TerminalManager( - containers[data.sandboxId] - ) - } - - const fileManager = fileManagers[data.sandboxId] - const terminalManager = terminalManagers[data.sandboxId] - - // Load file list from the file manager into the editor - sendLoadedEvent(fileManager.sandboxFiles) - - // Handle various socket events (heartbeat, file operations, terminal operations, etc.) - socket.on("heartbeat", async () => { - try { - // This keeps the container alive for another CONTAINER_TIMEOUT seconds. - // The E2B docs are unclear, but the timeout is relative to the time of this method call. - await containers[data.sandboxId].setTimeout(CONTAINER_TIMEOUT) - } catch (e: any) { - console.error("Error setting timeout:", e) - io.emit("error", `Error: set timeout. ${e.message ?? 
e}`) - } - }) - - // Handle request to get file content - socket.on("getFile", async (fileId: string, callback) => { - try { - const fileContent = await fileManager.getFile(fileId) - callback(fileContent) - } catch (e: any) { - console.error("Error getting file:", e) - io.emit("error", `Error: get file. ${e.message ?? e}`) - } - }) - - // Handle request to get folder contents - socket.on("getFolder", async (folderId: string, callback) => { - try { - const files = await fileManager.getFolder(folderId) - callback(files) - } catch (e: any) { - console.error("Error getting folder:", e) - io.emit("error", `Error: get folder. ${e.message ?? e}`) - } - }) - - // Handle request to save file - socket.on("saveFile", async (fileId: string, body: string) => { - try { - await saveFileRL.consume(data.userId, 1) - await fileManager.saveFile(fileId, body) - } catch (e: any) { - console.error("Error saving file:", e) - io.emit("error", `Error: file saving. ${e.message ?? e}`) - } - }) - - // Handle request to move file - socket.on( - "moveFile", - async (fileId: string, folderId: string, callback) => { - try { - const newFiles = await fileManager.moveFile(fileId, folderId) - callback(newFiles) - } catch (e: any) { - console.error("Error moving file:", e) - io.emit("error", `Error: file moving. ${e.message ?? e}`) - } - } - ) - - interface CallbackResponse { - success: boolean - apps?: string[] - message?: string - } - - // Handle request to list apps - socket.on( - "list", - async (callback: (response: CallbackResponse) => void) => { - console.log("Retrieving apps list...") - try { - if (!client) - throw Error("Failed to retrieve apps list: No Dokku client") - callback({ - success: true, - apps: await client.listApps(), - }) - } catch (error) { - callback({ - success: false, - message: "Failed to retrieve apps list", - }) - } - } - ) - - // Handle request to deploy project - socket.on( - "deploy", - async (callback: (response: CallbackResponse) => void) => { - try { - // Push the project files to the Dokku server - console.log("Deploying project ${data.sandboxId}...") - if (!git) throw Error("Failed to retrieve apps list: No git client") - // Remove the /project/[id]/ component of each file path: - const fixedFilePaths = fileManager.sandboxFiles.fileData.map( - (file) => { - return { - ...file, - id: file.id.split("/").slice(2).join("/"), - } - } - ) - // Push all files to Dokku. - await git.pushFiles(fixedFilePaths, data.sandboxId) - callback({ - success: true, - }) - } catch (error) { - callback({ - success: false, - message: "Failed to deploy project: " + error, - }) - } - } - ) - - // Handle request to create a new file - socket.on("createFile", async (name: string, callback) => { - try { - await createFileRL.consume(data.userId, 1) - const success = await fileManager.createFile(name) - callback({ success }) - } catch (e: any) { - console.error("Error creating file:", e) - io.emit("error", `Error: file creation. ${e.message ?? e}`) - } - }) - - // Handle request to create a new folder - socket.on("createFolder", async (name: string, callback) => { - try { - await createFolderRL.consume(data.userId, 1) - await fileManager.createFolder(name) - callback() - } catch (e: any) { - console.error("Error creating folder:", e) - io.emit("error", `Error: folder creation. ${e.message ?? 
e}`) - } - }) - - // Handle request to rename a file - socket.on("renameFile", async (fileId: string, newName: string) => { - try { - await renameFileRL.consume(data.userId, 1) - await fileManager.renameFile(fileId, newName) - } catch (e: any) { - console.error("Error renaming file:", e) - io.emit("error", `Error: file renaming. ${e.message ?? e}`) - } - }) - - // Handle request to delete a file - socket.on("deleteFile", async (fileId: string, callback) => { - try { - await deleteFileRL.consume(data.userId, 1) - const newFiles = await fileManager.deleteFile(fileId) - callback(newFiles) - } catch (e: any) { - console.error("Error deleting file:", e) - io.emit("error", `Error: file deletion. ${e.message ?? e}`) - } - }) - - // Handle request to delete a folder - socket.on("deleteFolder", async (folderId: string, callback) => { - try { - const newFiles = await fileManager.deleteFolder(folderId) - callback(newFiles) - } catch (e: any) { - console.error("Error deleting folder:", e) - io.emit("error", `Error: folder deletion. ${e.message ?? e}`) - } - }) - - // Handle request to create a new terminal - socket.on("createTerminal", async (id: string, callback) => { - try { - await lockManager.acquireLock(data.sandboxId, async () => { - await terminalManager.createTerminal(id, (responseString: string) => { - io.emit("terminalResponse", { id, data: responseString }) - const port = extractPortNumber(responseString) - if (port) { - io.emit( - "previewURL", - "https://" + containers[data.sandboxId].getHost(port) - ) - } - }) - }) - callback() - } catch (e: any) { - console.error(`Error creating terminal ${id}:`, e) - io.emit("error", `Error: terminal creation. ${e.message ?? e}`) - } - }) - - // Handle request to resize terminal - socket.on( - "resizeTerminal", - (dimensions: { cols: number; rows: number }) => { - try { - terminalManager.resizeTerminal(dimensions) - } catch (e: any) { - console.error("Error resizing terminal:", e) - io.emit("error", `Error: terminal resizing. ${e.message ?? e}`) - } - } - ) - - // Handle terminal input data - socket.on("terminalData", async (id: string, data: string) => { - try { - await terminalManager.sendTerminalData(id, data) - } catch (e: any) { - console.error("Error writing to terminal:", e) - io.emit("error", `Error: writing to terminal. ${e.message ?? e}`) - } - }) - - // Handle request to close terminal - socket.on("closeTerminal", async (id: string, callback) => { - try { - await terminalManager.closeTerminal(id) - callback() - } catch (e: any) { - console.error("Error closing terminal:", e) - io.emit("error", `Error: closing terminal. ${e.message ?? e}`) - } - }) - - // Handle request to generate code - socket.on( - "generateCode", - async ( - fileName: string, - code: string, - line: number, - instructions: string, - callback - ) => { - try { - const result = await aiWorker.generateCode( - data.userId, - fileName, - code, - line, - instructions - ) - callback(result) - } catch (e: any) { - console.error("Error generating code:", e) - io.emit("error", `Error: code generation. ${e.message ?? e}`) - } - } - ) - - // Handle socket disconnection - socket.on("disconnect", async () => { - try { - if (data.isOwner) { - connections[data.sandboxId]-- - } - - await terminalManager.closeAllTerminals() - await fileManager.closeWatchers() - - if (data.isOwner && connections[data.sandboxId] <= 0) { - socket.broadcast.emit( - "disableAccess", - "The sandbox owner has disconnected." 
- ) - } - } catch (e: any) { - console.log("Error disconnecting:", e) - io.emit("error", `Error: disconnecting. ${e.message ?? e}`) - } - }) } catch (e: any) { - console.error("Error connecting:", e) - io.emit("error", `Error: connection. ${e.message ?? e}`) + handleErrors("Error connecting:", e, socket); } }) // Start the server httpServer.listen(port, () => { console.log(`Server running on port ${port}`) -}) +}) \ No newline at end of file diff --git a/backend/server/src/socketAuth.ts b/backend/server/src/socketAuth.ts new file mode 100644 index 0000000..3bd83b1 --- /dev/null +++ b/backend/server/src/socketAuth.ts @@ -0,0 +1,63 @@ +import { Socket } from "socket.io" +import { z } from "zod" +import { User } from "./types" + +// Middleware for socket authentication +export const socketAuth = async (socket: Socket, next: Function) => { + // Define the schema for handshake query validation + const handshakeSchema = z.object({ + userId: z.string(), + sandboxId: z.string(), + EIO: z.string(), + transport: z.string(), + }) + + const q = socket.handshake.query + const parseQuery = handshakeSchema.safeParse(q) + + // Check if the query is valid according to the schema + if (!parseQuery.success) { + next(new Error("Invalid request.")) + return + } + + const { sandboxId, userId } = parseQuery.data + // Fetch user data from the database + const dbUser = await fetch( + `${process.env.DATABASE_WORKER_URL}/api/user?id=${userId}`, + { + headers: { + Authorization: `${process.env.WORKERS_KEY}`, + }, + } + ) + const dbUserJSON = (await dbUser.json()) as User + + // Check if user data was retrieved successfully + if (!dbUserJSON) { + next(new Error("DB error.")) + return + } + + // Check if the user owns the sandbox or has shared access + const sandbox = dbUserJSON.sandbox.find((s) => s.id === sandboxId) + const sharedSandboxes = dbUserJSON.usersToSandboxes.find( + (uts) => uts.sandboxId === sandboxId + ) + + // If user doesn't own or have shared access to the sandbox, deny access + if (!sandbox && !sharedSandboxes) { + next(new Error("Invalid credentials.")) + return + } + + // Set socket data with user information + socket.data = { + userId, + sandboxId: sandboxId, + isOwner: sandbox !== undefined, + } + + // Allow the connection + next() +} diff --git a/backend/server/src/types.ts b/backend/server/src/types.ts index 42ad6d0..93e45e6 100644 --- a/backend/server/src/types.ts +++ b/backend/server/src/types.ts @@ -68,3 +68,8 @@ export type R2FileBody = R2FileData & { json: Promise blob: Promise } +export interface DokkuResponse { + success: boolean + apps?: string[] + message?: string +} diff --git a/backend/server/src/utils.ts b/backend/server/src/utils.ts index 5ae1377..dd33984 100644 --- a/backend/server/src/utils.ts +++ b/backend/server/src/utils.ts @@ -20,4 +20,4 @@ export class LockManager { } return await this.locks[key] } -} +} \ No newline at end of file diff --git a/backend/storage/.prettierrc b/backend/storage/.prettierrc deleted file mode 100644 index 42830e2..0000000 --- a/backend/storage/.prettierrc +++ /dev/null @@ -1,5 +0,0 @@ -{ - "tabWidth": 2, - "semi": false, - "singleQuote": false -} \ No newline at end of file diff --git a/backend/storage/package.json b/backend/storage/package.json index 3f5c39f..215832e 100644 --- a/backend/storage/package.json +++ b/backend/storage/package.json @@ -1,23 +1,23 @@ { - "name": "storage", - "version": "0.0.0", - "private": true, - "scripts": { - "deploy": "wrangler deploy", - "dev": "wrangler dev --remote", - "start": "wrangler dev", - "test": "vitest", 
- "cf-typegen": "wrangler types" - }, - "devDependencies": { - "@cloudflare/vitest-pool-workers": "^0.1.0", - "@cloudflare/workers-types": "^4.20240419.0", - "typescript": "^5.0.4", - "vitest": "1.3.0", - "wrangler": "^3.0.0" - }, - "dependencies": { - "p-limit": "^6.1.0", - "zod": "^3.23.4" - } -} + "name": "storage", + "version": "0.0.0", + "private": true, + "scripts": { + "deploy": "wrangler deploy", + "dev": "wrangler dev --remote", + "start": "wrangler dev", + "test": "vitest", + "cf-typegen": "wrangler types" + }, + "devDependencies": { + "@cloudflare/vitest-pool-workers": "^0.1.0", + "@cloudflare/workers-types": "^4.20240419.0", + "typescript": "^5.0.4", + "vitest": "1.3.0", + "wrangler": "^3.0.0" + }, + "dependencies": { + "p-limit": "^6.1.0", + "zod": "^3.23.4" + } +} \ No newline at end of file diff --git a/backend/storage/src/index.ts b/backend/storage/src/index.ts index c9b371e..e7a7294 100644 --- a/backend/storage/src/index.ts +++ b/backend/storage/src/index.ts @@ -1,5 +1,5 @@ +import pLimit from "p-limit" import { z } from "zod" -import pLimit from 'p-limit'; export interface Env { R2: R2Bucket @@ -144,20 +144,24 @@ export default { const body = await request.json() const { sandboxId, type } = initSchema.parse(body) - console.log(`Copying template: ${type}`); + console.log(`Copying template: ${type}`) // List all objects under the directory - const { objects } = await env.Templates.list({ prefix: type }); + const { objects } = await env.Templates.list({ prefix: type }) // Copy each object to the new directory with a 5 concurrency limit - const limit = pLimit(5); - await Promise.all(objects.map(({ key }) => - limit(async () => { - const destinationKey = key.replace(type, `projects/${sandboxId}`); - const fileBody = await env.Templates.get(key).then(res => res?.body ?? ""); - await env.R2.put(destinationKey, fileBody); - }) - )); + const limit = pLimit(5) + await Promise.all( + objects.map(({ key }) => + limit(async () => { + const destinationKey = key.replace(type, `projects/${sandboxId}`) + const fileBody = await env.Templates.get(key).then( + (res) => res?.body ?? "" + ) + await env.R2.put(destinationKey, fileBody) + }) + ) + ) return success } else { diff --git a/backend/storage/test/index.spec.ts b/backend/storage/test/index.spec.ts index fbee335..706f17c 100644 --- a/backend/storage/test/index.spec.ts +++ b/backend/storage/test/index.spec.ts @@ -1,25 +1,30 @@ // test/index.spec.ts -import { env, createExecutionContext, waitOnExecutionContext, SELF } from 'cloudflare:test'; -import { describe, it, expect } from 'vitest'; -import worker from '../src/index'; +import { + createExecutionContext, + env, + SELF, + waitOnExecutionContext, +} from "cloudflare:test" +import { describe, expect, it } from "vitest" +import worker from "../src/index" // For now, you'll need to do something like this to get a correctly-typed // `Request` to pass to `worker.fetch()`. -const IncomingRequest = Request; +const IncomingRequest = Request -describe('Hello World worker', () => { - it('responds with Hello World! (unit style)', async () => { - const request = new IncomingRequest('http://example.com'); +describe("Hello World worker", () => { + it("responds with Hello World! (unit style)", async () => { + const request = new IncomingRequest("http://example.com") // Create an empty context to pass to `worker.fetch()`. 
- const ctx = createExecutionContext(); - const response = await worker.fetch(request, env, ctx); + const ctx = createExecutionContext() + const response = await worker.fetch(request, env, ctx) // Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions - await waitOnExecutionContext(ctx); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); + await waitOnExecutionContext(ctx) + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) - it('responds with Hello World! (integration style)', async () => { - const response = await SELF.fetch('https://example.com'); - expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`); - }); -}); + it("responds with Hello World! (integration style)", async () => { + const response = await SELF.fetch("https://example.com") + expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`) + }) +}) diff --git a/backend/storage/test/tsconfig.json b/backend/storage/test/tsconfig.json index 509425f..339ee9b 100644 --- a/backend/storage/test/tsconfig.json +++ b/backend/storage/test/tsconfig.json @@ -1,11 +1,11 @@ { - "extends": "../tsconfig.json", - "compilerOptions": { - "types": [ - "@cloudflare/workers-types/experimental", - "@cloudflare/vitest-pool-workers" - ] - }, - "include": ["./**/*.ts", "../src/env.d.ts"], - "exclude": [] + "extends": "../tsconfig.json", + "compilerOptions": { + "types": [ + "@cloudflare/workers-types/experimental", + "@cloudflare/vitest-pool-workers" + ] + }, + "include": ["./**/*.ts", "../src/env.d.ts"], + "exclude": [] } diff --git a/backend/storage/tsconfig.json b/backend/storage/tsconfig.json index 9192490..8b55b9c 100644 --- a/backend/storage/tsconfig.json +++ b/backend/storage/tsconfig.json @@ -12,7 +12,9 @@ /* Language and Environment */ "target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */, - "lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, + "lib": [ + "es2021" + ] /* Specify a set of bundled library declaration files that describe the target runtime environment. */, "jsx": "react" /* Specify what JSX code is generated. */, // "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */ // "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. 
*/ diff --git a/backend/storage/vitest.config.ts b/backend/storage/vitest.config.ts index 973627c..5643ba3 100644 --- a/backend/storage/vitest.config.ts +++ b/backend/storage/vitest.config.ts @@ -1,11 +1,11 @@ -import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config"; +import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config" export default defineWorkersConfig({ - test: { - poolOptions: { - workers: { - wrangler: { configPath: "./wrangler.toml" }, - }, - }, - }, -}); + test: { + poolOptions: { + workers: { + wrangler: { configPath: "./wrangler.toml" }, + }, + }, + }, +}) diff --git a/backend/storage/worker-configuration.d.ts b/backend/storage/worker-configuration.d.ts index 5b2319b..a3f43d2 100644 --- a/backend/storage/worker-configuration.d.ts +++ b/backend/storage/worker-configuration.d.ts @@ -1,4 +1,3 @@ // Generated by Wrangler // After adding bindings to `wrangler.toml`, regenerate this interface via `npm run cf-typegen` -interface Env { -} +interface Env {} diff --git a/frontend/.prettierrc b/frontend/.prettierrc deleted file mode 100644 index 42830e2..0000000 --- a/frontend/.prettierrc +++ /dev/null @@ -1,5 +0,0 @@ -{ - "tabWidth": 2, - "semi": false, - "singleQuote": false -} \ No newline at end of file diff --git a/frontend/app/(app)/code/[id]/page.tsx b/frontend/app/(app)/code/[id]/page.tsx index 9681a5b..d193d39 100644 --- a/frontend/app/(app)/code/[id]/page.tsx +++ b/frontend/app/(app)/code/[id]/page.tsx @@ -1,12 +1,11 @@ -import Navbar from "@/components/editor/navbar" import { Room } from "@/components/editor/live/room" +import Loading from "@/components/editor/loading" +import Navbar from "@/components/editor/navbar" +import { TerminalProvider } from "@/context/TerminalContext" import { Sandbox, User, UsersToSandboxes } from "@/lib/types" import { currentUser } from "@clerk/nextjs" -import { notFound, redirect } from "next/navigation" -import Loading from "@/components/editor/loading" import dynamic from "next/dynamic" -import fs from "fs" -import { TerminalProvider } from "@/context/TerminalContext" +import { notFound, redirect } from "next/navigation" export const revalidate = 0 @@ -89,19 +88,20 @@ export default async function CodePage({ params }: { params: { id: string } }) { return ( <> -
- - - -
- -
-
-
-
+
+ + + +
+ +
+
+
+
) } diff --git a/frontend/app/(app)/dashboard/page.tsx b/frontend/app/(app)/dashboard/page.tsx index 1f29f96..52f8c3f 100644 --- a/frontend/app/(app)/dashboard/page.tsx +++ b/frontend/app/(app)/dashboard/page.tsx @@ -1,8 +1,8 @@ -import { UserButton, currentUser } from "@clerk/nextjs" -import { redirect } from "next/navigation" import Dashboard from "@/components/dashboard" import Navbar from "@/components/dashboard/navbar" -import { Sandbox, User } from "@/lib/types" +import { User } from "@/lib/types" +import { currentUser } from "@clerk/nextjs" +import { redirect } from "next/navigation" export default async function DashboardPage() { const user = await currentUser() diff --git a/frontend/app/layout.tsx b/frontend/app/layout.tsx index c93b647..494079e 100644 --- a/frontend/app/layout.tsx +++ b/frontend/app/layout.tsx @@ -15,7 +15,7 @@ export const metadata: Metadata = { } export default function RootLayout({ - children + children, }: Readonly<{ children: React.ReactNode }>) { @@ -29,9 +29,7 @@ export default function RootLayout({ disableTransitionOnChange > - - {children} - + {children} @@ -40,4 +38,4 @@ export default function RootLayout({ ) -} \ No newline at end of file +} diff --git a/frontend/app/page.tsx b/frontend/app/page.tsx index 3f99044..8041367 100644 --- a/frontend/app/page.tsx +++ b/frontend/app/page.tsx @@ -1,13 +1,13 @@ -import { currentUser } from "@clerk/nextjs"; -import { redirect } from "next/navigation"; -import Landing from "@/components/landing"; +import Landing from "@/components/landing" +import { currentUser } from "@clerk/nextjs" +import { redirect } from "next/navigation" export default async function Home() { - const user = await currentUser(); + const user = await currentUser() if (user) { - redirect("/dashboard"); + redirect("/dashboard") } - return ; + return } diff --git a/frontend/components/dashboard/about.tsx b/frontend/components/dashboard/about.tsx index 33b0daa..7accef0 100644 --- a/frontend/components/dashboard/about.tsx +++ b/frontend/components/dashboard/about.tsx @@ -3,16 +3,9 @@ import { Dialog, DialogContent, - DialogDescription, DialogHeader, DialogTitle, - DialogTrigger, } from "@/components/ui/dialog" -import Image from "next/image" -import { useState } from "react" - -import { Button } from "../ui/button" -import { ChevronRight } from "lucide-react" export default function AboutModal({ open, diff --git a/frontend/components/dashboard/index.tsx b/frontend/components/dashboard/index.tsx index a7094ec..ecd3837 100644 --- a/frontend/components/dashboard/index.tsx +++ b/frontend/components/dashboard/index.tsx @@ -1,24 +1,16 @@ "use client" -import CustomButton from "@/components/ui/customButton" import { Button } from "@/components/ui/button" -import { - Code2, - FolderDot, - HelpCircle, - Plus, - Settings, - Users, -} from "lucide-react" -import { useEffect, useState } from "react" +import CustomButton from "@/components/ui/customButton" import { Sandbox } from "@/lib/types" +import { Code2, FolderDot, HelpCircle, Plus, Users } from "lucide-react" +import { useRouter, useSearchParams } from "next/navigation" +import { useEffect, useState } from "react" +import { toast } from "sonner" +import AboutModal from "./about" +import NewProjectModal from "./newProject" import DashboardProjects from "./projects" import DashboardSharedWithMe from "./shared" -import NewProjectModal from "./newProject" -import Link from "next/link" -import { useRouter, useSearchParams } from "next/navigation" -import AboutModal from "./about" -import { toast } from 
"sonner" type TScreen = "projects" | "shared" | "settings" | "search" @@ -49,8 +41,9 @@ export default function Dashboard({ const q = searchParams.get("q") const router = useRouter() - useEffect(() => { // update the dashboard to show a new project - router.refresh() + useEffect(() => { + // update the dashboard to show a new project + router.refresh() }, []) return ( @@ -102,7 +95,7 @@ export default function Dashboard({ */} diff --git a/frontend/components/dashboard/navbar/search.tsx b/frontend/components/dashboard/navbar/search.tsx index f254efe..75f314e 100644 --- a/frontend/components/dashboard/navbar/search.tsx +++ b/frontend/components/dashboard/navbar/search.tsx @@ -1,13 +1,12 @@ -"use client"; +"use client" -import { Input } from "../../ui/input"; -import { Search } from "lucide-react"; -import { useEffect, useState } from "react"; -import { useRouter } from "next/navigation"; +import { Search } from "lucide-react" +import { useRouter } from "next/navigation" +import { Input } from "../../ui/input" export default function DashboardNavbarSearch() { // const [search, setSearch] = useState(""); - const router = useRouter(); + const router = useRouter() // useEffect(() => { // const delayDebounceFn = setTimeout(() => { @@ -29,14 +28,14 @@ export default function DashboardNavbarSearch() { // onChange={(e) => setSearch(e.target.value)} onChange={(e) => { if (e.target.value === "") { - router.push(`/dashboard`); - return; + router.push(`/dashboard`) + return } - router.push(`/dashboard?q=${e.target.value}`); + router.push(`/dashboard?q=${e.target.value}`) }} placeholder="Search projects..." className="pl-8" /> - ); + ) } diff --git a/frontend/components/dashboard/newProject.tsx b/frontend/components/dashboard/newProject.tsx index 012d385..392db96 100644 --- a/frontend/components/dashboard/newProject.tsx +++ b/frontend/components/dashboard/newProject.tsx @@ -288,7 +288,7 @@ function SearchInput({
diff --git a/frontend/components/dashboard/projectCard/dropdown.tsx b/frontend/components/dashboard/projectCard/dropdown.tsx index 24a93f8..522d5bc 100644 --- a/frontend/components/dashboard/projectCard/dropdown.tsx +++ b/frontend/components/dashboard/projectCard/dropdown.tsx @@ -1,30 +1,30 @@ -"use client"; +"use client" -import { Sandbox } from "@/lib/types"; -import { Ellipsis, Globe, Lock, Trash2 } from "lucide-react"; +import { Sandbox } from "@/lib/types" +import { Ellipsis, Globe, Lock, Trash2 } from "lucide-react" import { DropdownMenu, DropdownMenuContent, DropdownMenuItem, DropdownMenuTrigger, -} from "@/components/ui/dropdown-menu"; +} from "@/components/ui/dropdown-menu" export default function ProjectCardDropdown({ sandbox, onVisibilityChange, onDelete, }: { - sandbox: Sandbox; - onVisibilityChange: (sandbox: Sandbox) => void; - onDelete: (sandbox: Sandbox) => void; + sandbox: Sandbox + onVisibilityChange: (sandbox: Sandbox) => void + onDelete: (sandbox: Sandbox) => void }) { return ( { - e.preventDefault(); - e.stopPropagation(); + e.preventDefault() + e.stopPropagation() }} className="h-6 w-6 flex items-center justify-center transition-colors bg-transparent hover:bg-muted-foreground/25 rounded-sm outline-foreground" > @@ -33,8 +33,8 @@ export default function ProjectCardDropdown({ { - e.stopPropagation(); - onVisibilityChange(sandbox); + e.stopPropagation() + onVisibilityChange(sandbox) }} className="cursor-pointer" > @@ -52,8 +52,8 @@ export default function ProjectCardDropdown({ { - e.stopPropagation(); - onDelete(sandbox); + e.stopPropagation() + onDelete(sandbox) }} className="!text-destructive cursor-pointer" > @@ -62,5 +62,5 @@ export default function ProjectCardDropdown({ - ); + ) } diff --git a/frontend/components/dashboard/projectCard/index.tsx b/frontend/components/dashboard/projectCard/index.tsx index 9cc3729..a463e84 100644 --- a/frontend/components/dashboard/projectCard/index.tsx +++ b/frontend/components/dashboard/projectCard/index.tsx @@ -1,14 +1,14 @@ "use client" +import { Card } from "@/components/ui/card" +import { projectTemplates } from "@/lib/data" +import { Sandbox } from "@/lib/types" import { AnimatePresence, motion } from "framer-motion" +import { Clock, Globe, Lock } from "lucide-react" import Image from "next/image" +import { useRouter } from "next/navigation" import { useEffect, useState } from "react" import ProjectCardDropdown from "./dropdown" -import { Clock, Globe, Lock } from "lucide-react" -import { Sandbox } from "@/lib/types" -import { Card } from "@/components/ui/card" -import { useRouter } from "next/navigation" -import { projectTemplates } from "@/lib/data" export default function ProjectCard({ children, diff --git a/frontend/components/dashboard/projectCard/revealEffect.tsx b/frontend/components/dashboard/projectCard/revealEffect.tsx index 3786394..fcda16c 100644 --- a/frontend/components/dashboard/projectCard/revealEffect.tsx +++ b/frontend/components/dashboard/projectCard/revealEffect.tsx @@ -1,8 +1,8 @@ -"use client"; -import { cn } from "@/lib/utils"; -import { Canvas, useFrame, useThree } from "@react-three/fiber"; -import React, { useMemo, useRef } from "react"; -import * as THREE from "three"; +"use client" +import { cn } from "@/lib/utils" +import { Canvas, useFrame, useThree } from "@react-three/fiber" +import React, { useMemo, useRef } from "react" +import * as THREE from "three" export const CanvasRevealEffect = ({ animationSpeed = 0.4, @@ -12,12 +12,12 @@ export const CanvasRevealEffect = ({ dotSize, showGradient = true, 
}: { - animationSpeed?: number; - opacities?: number[]; - colors?: number[][]; - containerClassName?: string; - dotSize?: number; - showGradient?: boolean; + animationSpeed?: number + opacities?: number[] + colors?: number[][] + containerClassName?: string + dotSize?: number + showGradient?: boolean }) => { return (
@@ -41,16 +41,16 @@ export const CanvasRevealEffect = ({
)}
- ); -}; + ) +} interface DotMatrixProps { - colors?: number[][]; - opacities?: number[]; - totalSize?: number; - dotSize?: number; - shader?: string; - center?: ("x" | "y")[]; + colors?: number[][] + opacities?: number[] + totalSize?: number + dotSize?: number + shader?: string + center?: ("x" | "y")[] } const DotMatrix: React.FC = ({ @@ -69,7 +69,7 @@ const DotMatrix: React.FC = ({ colors[0], colors[0], colors[0], - ]; + ] if (colors.length === 2) { colorsArray = [ colors[0], @@ -78,7 +78,7 @@ const DotMatrix: React.FC = ({ colors[1], colors[1], colors[1], - ]; + ] } else if (colors.length === 3) { colorsArray = [ colors[0], @@ -87,7 +87,7 @@ const DotMatrix: React.FC = ({ colors[1], colors[2], colors[2], - ]; + ] } return { @@ -111,8 +111,8 @@ const DotMatrix: React.FC = ({ value: dotSize, type: "uniform1f", }, - }; - }, [colors, opacities, totalSize, dotSize]); + } + }, [colors, opacities, totalSize, dotSize]) return ( = ({ uniforms={uniforms} maxFps={60} /> - ); -}; + ) +} type Uniforms = { [key: string]: { - value: number[] | number[][] | number; - type: string; - }; -}; + value: number[] | number[][] | number + type: string + } +} const ShaderMaterial = ({ source, uniforms, maxFps = 60, }: { - source: string; - hovered?: boolean; - maxFps?: number; - uniforms: Uniforms; + source: string + hovered?: boolean + maxFps?: number + uniforms: Uniforms }) => { - const { size } = useThree(); - const ref = useRef(); - let lastFrameTime = 0; + const { size } = useThree() + const ref = useRef() + let lastFrameTime = 0 useFrame(({ clock }) => { - if (!ref.current) return; - const timestamp = clock.getElapsedTime(); + if (!ref.current) return + const timestamp = clock.getElapsedTime() if (timestamp - lastFrameTime < 1 / maxFps) { - return; + return } - lastFrameTime = timestamp; + lastFrameTime = timestamp - const material: any = ref.current.material; - const timeLocation = material.uniforms.u_time; - timeLocation.value = timestamp; - }); + const material: any = ref.current.material + const timeLocation = material.uniforms.u_time + timeLocation.value = timestamp + }) const getUniforms = () => { - const preparedUniforms: any = {}; + const preparedUniforms: any = {} for (const uniformName in uniforms) { - const uniform: any = uniforms[uniformName]; + const uniform: any = uniforms[uniformName] switch (uniform.type) { case "uniform1f": - preparedUniforms[uniformName] = { value: uniform.value, type: "1f" }; - break; + preparedUniforms[uniformName] = { value: uniform.value, type: "1f" } + break case "uniform3f": preparedUniforms[uniformName] = { value: new THREE.Vector3().fromArray(uniform.value), type: "3f", - }; - break; + } + break case "uniform1fv": - preparedUniforms[uniformName] = { value: uniform.value, type: "1fv" }; - break; + preparedUniforms[uniformName] = { value: uniform.value, type: "1fv" } + break case "uniform3fv": preparedUniforms[uniformName] = { value: uniform.value.map((v: number[]) => new THREE.Vector3().fromArray(v) ), type: "3fv", - }; - break; + } + break case "uniform2f": preparedUniforms[uniformName] = { value: new THREE.Vector2().fromArray(uniform.value), type: "2f", - }; - break; + } + break default: - console.error(`Invalid uniform type for '${uniformName}'.`); - break; + console.error(`Invalid uniform type for '${uniformName}'.`) + break } } - preparedUniforms["u_time"] = { value: 0, type: "1f" }; + preparedUniforms["u_time"] = { value: 0, type: "1f" } preparedUniforms["u_resolution"] = { value: new THREE.Vector2(size.width * 2, size.height * 2), - }; // Initialize 
u_resolution - return preparedUniforms; - }; + } // Initialize u_resolution + return preparedUniforms + } // Shader material const material = useMemo(() => { @@ -272,33 +272,33 @@ const ShaderMaterial = ({ blending: THREE.CustomBlending, blendSrc: THREE.SrcAlphaFactor, blendDst: THREE.OneFactor, - }); + }) - return materialObject; - }, [size.width, size.height, source]); + return materialObject + }, [size.width, size.height, source]) return ( - ); -}; + ) +} const Shader: React.FC = ({ source, uniforms, maxFps = 60 }) => { return ( - ); -}; + ) +} interface ShaderProps { - source: string; + source: string uniforms: { [key: string]: { - value: number[] | number[][] | number; - type: string; - }; - }; - maxFps?: number; + value: number[] | number[][] | number + type: string + } + } + maxFps?: number } diff --git a/frontend/components/dashboard/projects.tsx b/frontend/components/dashboard/projects.tsx index 9953e19..7c9705d 100644 --- a/frontend/components/dashboard/projects.tsx +++ b/frontend/components/dashboard/projects.tsx @@ -1,16 +1,12 @@ -"use client"; +"use client" -import { Sandbox } from "@/lib/types"; -import ProjectCard from "./projectCard"; -import Image from "next/image"; -import ProjectCardDropdown from "./projectCard/dropdown"; -import { Clock, Globe, Lock } from "lucide-react"; -import Link from "next/link"; -import { Card } from "../ui/card"; -import { deleteSandbox, updateSandbox } from "@/lib/actions"; -import { toast } from "sonner"; -import { useEffect, useState } from "react"; -import { CanvasRevealEffect } from "./projectCard/revealEffect"; +import { deleteSandbox, updateSandbox } from "@/lib/actions" +import { Sandbox } from "@/lib/types" +import Link from "next/link" +import { useEffect, useState } from "react" +import { toast } from "sonner" +import ProjectCard from "./projectCard" +import { CanvasRevealEffect } from "./projectCard/revealEffect" const colors: { [key: string]: number[][] } = { react: [ @@ -21,38 +17,37 @@ const colors: { [key: string]: number[][] } = { [86, 184, 72], [59, 112, 52], ], -}; +} export default function DashboardProjects({ sandboxes, q, }: { - sandboxes: Sandbox[]; - q: string | null; + sandboxes: Sandbox[] + q: string | null }) { - const [deletingId, setDeletingId] = useState(""); + const [deletingId, setDeletingId] = useState("") const onDelete = async (sandbox: Sandbox) => { - setDeletingId(sandbox.id); - toast(`Project ${sandbox.name} deleted.`); - await deleteSandbox(sandbox.id); - }; + setDeletingId(sandbox.id) + toast(`Project ${sandbox.name} deleted.`) + await deleteSandbox(sandbox.id) + } useEffect(() => { if (deletingId) { - setDeletingId(""); + setDeletingId("") } - }, [sandboxes]); + }, [sandboxes]) const onVisibilityChange = async (sandbox: Sandbox) => { - const newVisibility = - sandbox.visibility === "public" ? "private" : "public"; - toast(`Project ${sandbox.name} is now ${newVisibility}.`); + const newVisibility = sandbox.visibility === "public" ? "private" : "public" + toast(`Project ${sandbox.name} is now ${newVisibility}.`) await updateSandbox({ id: sandbox.id, visibility: newVisibility, - }); - }; + }) + } return (
@@ -65,7 +60,7 @@ export default function DashboardProjects({ {sandboxes.map((sandbox) => { if (q && q.length > 0) { if (!sandbox.name.toLowerCase().includes(q.toLowerCase())) { - return null; + return null } } return ( @@ -93,7 +88,7 @@ export default function DashboardProjects({
- ); + ) })}
) : ( @@ -103,5 +98,5 @@ export default function DashboardProjects({ )}
- ); + ) } diff --git a/frontend/components/dashboard/shared.tsx b/frontend/components/dashboard/shared.tsx index 9e5eb03..d9d4d39 100644 --- a/frontend/components/dashboard/shared.tsx +++ b/frontend/components/dashboard/shared.tsx @@ -1,29 +1,27 @@ -import { Sandbox } from "@/lib/types"; import { Table, TableBody, - TableCaption, TableCell, TableHead, TableHeader, TableRow, -} from "@/components/ui/table"; -import Image from "next/image"; -import Button from "../ui/customButton"; -import { ChevronRight } from "lucide-react"; -import Avatar from "../ui/avatar"; -import Link from "next/link"; +} from "@/components/ui/table" +import { ChevronRight } from "lucide-react" +import Image from "next/image" +import Link from "next/link" +import Avatar from "../ui/avatar" +import Button from "../ui/customButton" export default function DashboardSharedWithMe({ shared, }: { shared: { - id: string; - name: string; - type: "react" | "node"; - author: string; - sharedOn: Date; - }[]; + id: string + name: string + type: "react" | "node" + author: string + sharedOn: Date + }[] }) { return (
@@ -86,5 +84,5 @@ export default function DashboardSharedWithMe({
)} - ); + ) } diff --git a/frontend/components/editor/AIChat/ChatInput.tsx b/frontend/components/editor/AIChat/ChatInput.tsx index a40e0b5..380b6a4 100644 --- a/frontend/components/editor/AIChat/ChatInput.tsx +++ b/frontend/components/editor/AIChat/ChatInput.tsx @@ -1,36 +1,51 @@ -import React from 'react'; -import { Button } from '../../ui/button'; -import { Send, StopCircle } from 'lucide-react'; +import { Send, StopCircle } from "lucide-react" +import { Button } from "../../ui/button" interface ChatInputProps { - input: string; - setInput: (input: string) => void; - isGenerating: boolean; - handleSend: () => void; - handleStopGeneration: () => void; + input: string + setInput: (input: string) => void + isGenerating: boolean + handleSend: () => void + handleStopGeneration: () => void } -export default function ChatInput({ input, setInput, isGenerating, handleSend, handleStopGeneration }: ChatInputProps) { +export default function ChatInput({ + input, + setInput, + isGenerating, + handleSend, + handleStopGeneration, +}: ChatInputProps) { return (
- setInput(e.target.value)} - onKeyPress={(e) => e.key === 'Enter' && !isGenerating && handleSend()} + onKeyPress={(e) => e.key === "Enter" && !isGenerating && handleSend()} className="flex-grow p-2 border rounded-lg min-w-0 bg-input" placeholder="Type your message..." disabled={isGenerating} /> {isGenerating ? ( - ) : ( - )}
- ); + ) } diff --git a/frontend/components/editor/AIChat/ChatMessage.tsx b/frontend/components/editor/AIChat/ChatMessage.tsx index 7eac365..6b0fa72 100644 --- a/frontend/components/editor/AIChat/ChatMessage.tsx +++ b/frontend/components/editor/AIChat/ChatMessage.tsx @@ -1,25 +1,31 @@ -import React, { useState } from 'react'; -import { Button } from '../../ui/button'; -import { ChevronUp, ChevronDown, Copy, Check, CornerUpLeft } from 'lucide-react'; -import ReactMarkdown from 'react-markdown'; -import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter'; -import { vscDarkPlus } from 'react-syntax-highlighter/dist/esm/styles/prism'; -import remarkGfm from 'remark-gfm'; -import { copyToClipboard, stringifyContent } from './lib/chatUtils'; +import { Check, ChevronDown, ChevronUp, Copy, CornerUpLeft } from "lucide-react" +import React, { useState } from "react" +import ReactMarkdown from "react-markdown" +import { Prism as SyntaxHighlighter } from "react-syntax-highlighter" +import { vscDarkPlus } from "react-syntax-highlighter/dist/esm/styles/prism" +import remarkGfm from "remark-gfm" +import { Button } from "../../ui/button" +import { copyToClipboard, stringifyContent } from "./lib/chatUtils" interface MessageProps { message: { - role: 'user' | 'assistant'; - content: string; - context?: string; - }; - setContext: (context: string | null) => void; - setIsContextExpanded: (isExpanded: boolean) => void; + role: "user" | "assistant" + content: string + context?: string + } + setContext: (context: string | null) => void + setIsContextExpanded: (isExpanded: boolean) => void } -export default function ChatMessage({ message, setContext, setIsContextExpanded }: MessageProps) { - const [expandedMessageIndex, setExpandedMessageIndex] = useState(null); - const [copiedText, setCopiedText] = useState(null); +export default function ChatMessage({ + message, + setContext, + setIsContextExpanded, +}: MessageProps) { + const [expandedMessageIndex, setExpandedMessageIndex] = useState< + number | null + >(null) + const [copiedText, setCopiedText] = useState(null) const renderCopyButton = (text: any) => ( - ); + ) const askAboutCode = (code: any) => { - const contextString = stringifyContent(code); - setContext(`Regarding this code:\n${contextString}`); - setIsContextExpanded(false); - }; + const contextString = stringifyContent(code) + setContext(`Regarding this code:\n${contextString}`) + setIsContextExpanded(false) + } const renderMarkdownElement = (props: any) => { - const { node, children } = props; - const content = stringifyContent(children); + const { node, children } = props + const content = stringifyContent(children) return (
@@ -59,22 +65,30 @@ export default function ChatMessage({ message, setContext, setIsContextExpanded
- {React.createElement(node.tagName, { - ...props, - className: `${props.className || ''} hover:bg-transparent rounded p-1 transition-colors` - }, children)} + {React.createElement( + node.tagName, + { + ...props, + className: `${ + props.className || "" + } hover:bg-transparent rounded p-1 transition-colors`, + }, + children + )} - ); - }; + ) + } return (
-
- {message.role === 'user' && ( +
+ {message.role === "user" && (
{renderCopyButton(message.content)}