Compare commits


1 Commits

Author SHA1 Message Date
James Murdza
e58d088b5f Replace references to the app base URL with a dynamically generated URL. 2024-06-07 17:03:56 -04:00
185 changed files with 15175 additions and 28504 deletions

1
.gitignore vendored

@ -40,6 +40,7 @@ wrangler.toml
dist
backend/server/projects
backend/database/drizzle
app.yaml
ingressController.yaml


@ -1,7 +0,0 @@
{
"tabWidth": 2,
"semi": false,
"singleQuote": false,
"insertFinalNewline": true,
"useTabs": false
}


@ -1,3 +0,0 @@
{
"recommendations": ["esbenp.prettier-vscode"]
}

13
.vscode/settings.json vendored

@ -1,13 +0,0 @@
{
"editor.formatOnSave": true,
"editor.formatOnSaveMode": "file",
"editor.codeActionsOnSave": {
"source.fixAll": "explicit",
"source.organizeImports": "explicit"
},
"tailwindCSS.experimental.classRegex": [
["cva\\(([^)]*)\\)", "[\"'`]([^\"'`]*).*?[\"'`]"],
["cx\\(([^)]*)\\)", "(?:'|\"|`)([^']*)(?:'|\"|`)"]
],
"editor.defaultFormatter": "esbenp.prettier-vscode"
}


@ -1,7 +1,6 @@
MIT License
Copyright (c) 2024 Ishaan Dey
Copyright (c) 2024 GitWit, Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

297
README.md

@ -1,253 +1,101 @@
# GitWit Sandbox 📦🪄
# Sandbox 📦🪄
![2024-10-2307 17 42-ezgif com-resize](https://github.com/user-attachments/assets/a4057129-81a7-4a31-a093-c8bc8189ae72)
<img width="1799" alt="Screenshot 2024-05-31 at 8 33 56AM" src="https://github.com/ishaan1013/sandbox/assets/69771365/3f73d7c0-f82a-4997-b01e-eaa043e95113">
Sandbox is an open-source cloud-based code editing environment with custom AI code generation, live preview, real-time collaboration, and AI chat.
Sandbox is an open-source cloud-based code editing environment with custom AI code autocompletion and real-time collaboration.
For the latest updates, join our Discord server: [discord.gitwit.dev](https://discord.gitwit.dev/).
Check out the [Twitter thread](https://x.com/ishaandey_/status/1796338262002573526) with the demo video!
Check out this [guide](https://dev.to/jamesmurdza/how-to-setup-ishaan1013sandbox-locally-503p) made by [@jamesmurdza](https://x.com/jamesmurdza) on setting it up locally!
## Running Locally
Notes:
### Frontend
- Double-check that whatever you change "SUPERDUPERSECRET" to is the same in all config files.
### 0. Requirements
The application uses NodeJS for the backend, NextJS for the frontend, and Cloudflare Workers for additional backend tasks.
Needed accounts to set up:
- [Clerk](https://clerk.com/): Used for user authentication.
- [Liveblocks](https://liveblocks.io/): Used for collaborative editing.
- [E2B](https://e2b.dev/): Used for the terminals and live preview.
- [Cloudflare](https://www.cloudflare.com/): Used for relational data storage (D1) and file storage (R2).
- [Anthropic](https://anthropic.com/) and [OpenAI](https://openai.com/): API keys for code generation.
A quick overview of the tech before we start: The deployment uses a **NextJS** app for the frontend and an **ExpressJS** server on the backend. Presumably that's because NextJS integrates well with Clerk middleware but not with Socket.io.
### 1. Initial setup
No surprise in the first step:
Install dependencies
```bash
git clone https://github.com/jamesmurdza/sandbox
cd sandbox
cd frontend
npm install
```
Run `npm install` in:
Add the required environment variables in `.env` (example file provided in `.env.example`). You will need to make an account on [Clerk](https://clerk.com/) and [Liveblocks](https://liveblocks.io/) to get API keys.
```
/frontend
/backend/database
/backend/storage
/backend/server
Then, run in development mode
```bash
npm run dev
```
### 2. Adding Clerk
### Backend
Set up the Clerk account.
Get the API keys from Clerk.
The backend consists of a primary Express and Socket.io server, and 3 Cloudflare Workers microservices for the D1 database, R2 storage, and Workers AI. The D1 database also contains a [service binding](https://developers.cloudflare.com/workers/runtime-apis/bindings/service-bindings/) to the R2 storage worker.
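As a minimal sketch of what that looks like (the bound service name `storage` is an assumption; the binding name matches the `STORAGE` binding used by the database worker's code), the database worker's `wrangler.toml` declares something like:
```
# Sketch only: bind the R2 storage worker to the database worker.
[[services]]
binding = "STORAGE"
service = "storage"
```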
Update `/frontend/.env`:
#### Socket.io server
```
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY='🔑'
CLERK_SECRET_KEY='🔑'
Install dependencies
```bash
cd backend/server
npm install
```
### 3. Deploying the storage bucket
Add the required environment variables in `.env` (example file provided in `.env.example`)
Go to Cloudflare.
Create and name an R2 storage bucket in the control panel.
Copy your account ID (shown on the overview page of any of your domains).
Project files will be stored in the `projects/<project-id>` directory. The middleware contains basic authorization logic for connecting to the server.
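As a rough, hypothetical sketch of that kind of Socket.io authorization (not the repository's actual middleware), assuming the client identifies itself in the handshake query:
```ts
import { Server, Socket } from "socket.io"

const io = new Server(4000)

// Hypothetical sketch: reject connections that don't identify a user and a sandbox.
io.use(async (socket: Socket, next) => {
  const { userId, sandboxId } = socket.handshake.query as {
    userId?: string
    sandboxId?: string
  }
  if (!userId || !sandboxId) {
    return next(new Error("Unauthorized"))
  }
  // The real server would verify access to the sandbox (ownership or a share)
  // against the database before accepting the connection.
  next()
})
```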
Update `/backend/storage/src/wrangler.toml`:
Run in development mode
```
account_id = '🔑'
bucket_name = '🔑'
key = 'SUPERDUPERSECRET'
```bash
npm run dev
```
In the `/backend/storage/src` directory:
This directory is dockerized, so feel free to deploy a container on any platform of your choice! I chose not to deploy this project for public access due to costs & safety, but deploying your own for personal use should be no problem.
#### Cloudflare Workers (Database, Storage, AI)
Directories:
- `/backend/database`: D1 database
- `/backend/storage`: R2 storage
- `/backend/ai`: Workers AI
Install dependencies
```bash
cd backend/database
npm install
cd ../storage
npm install
cd ../ai
npm install
```
Read the [documentation](https://developers.cloudflare.com/workers/) to learn more about workers.
For each directory, add the required environment variables in `wrangler.toml` (example file provided in `wrangler.example.toml`). For the AI worker, you can define any value you want for `CF_AI_KEY`; set the same value in other `.env` files to authorize access.
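For example, using the same placeholder secret as the rest of this guide (which `.env` files need it depends on where the AI worker is called from):
```
CF_AI_KEY='SUPERDUPERSECRET'
```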
Run in development mode
```bash
npm run dev
```
Deploy to Cloudflare with [Wrangler](https://developers.cloudflare.com/workers/wrangler/install-and-update/)
```bash
npx wrangler deploy
```
### 4. Deploying the database
Follow this [guide](https://docs.google.com/document/d/1w5dA5daic_sIYB5Seni1KvnFx51pPV2so6lLdN2xa7Q/edit?usp=sharing) for more info.
Create a database:
```
npx wrangler d1 create sandbox-database
```
Use the output for the next step.
Update `/backend/database/src/wrangler.toml`:
```
database_name = '🔑'
database_id = '🔑'
KEY = 'SUPERDUPERSECRET'
STORAGE_WORKER_URL = 'https://storage.🍎.workers.dev'
```
In the `/backend/database/src` directory:
```
npx wrangler deploy
```
### 5. Applying the database schema
Delete the `/backend/database/drizzle/meta` directory.
In the `/backend/database/` directory:
```
npm run generate
npx wrangler d1 execute sandbox-database --remote --file=./drizzle/0000_🍏_🍐.sql
```
### 6. Configuring the server
Update `/backend/server/.env`:
```
DATABASE_WORKER_URL='https://database.🍎.workers.dev'
STORAGE_WORKER_URL='https://storage.🍎.workers.dev'
WORKERS_KEY='SUPERDUPERSECRET'
```
### 7. Adding Liveblocks
Set up the Liveblocks account.
Update `/frontend/.env`:
```
NEXT_PUBLIC_LIVEBLOCKS_PUBLIC_KEY='🔑'
LIVEBLOCKS_SECRET_KEY='🔑'
```
### 8. Adding E2B
Set up the E2B account.
Update `/backend/server/.env`:
```
E2B_API_KEY='🔑'
```
### 9. Configuring the frontend
Update `/frontend/.env`:
```
NEXT_PUBLIC_DATABASE_WORKER_URL='https://database.🍎.workers.dev'
NEXT_PUBLIC_STORAGE_WORKER_URL='https://storage.🍎.workers.dev'
NEXT_PUBLIC_WORKERS_KEY='SUPERDUPERSECRET'
ANTHROPIC_API_KEY='🔑'
OPENAI_API_KEY='🔑'
```
### 10. Running the IDE
Run `npm run dev` simultaneously in:
```
/frontend
/backend/server
```
## Setting up Deployments
The steps above do not cover setting up [Dokku](https://github.com/dokku/dokku), which is required for deployments.
**Note:** This is completely optional to set up if you just want to run GitWit Sandbox.
Setting up deployments first requires a separate domain (such as gitwit.app, which we use).
We then deploy Dokku on a separate server, according to this guide: https://dev.to/jamesmurdza/host-your-own-paas-platform-as-a-service-on-amazon-web-services-3f0d
Then we install [dokku-daemon](https://github.com/dokku/dokku-daemon) with the following commands:
```
git clone https://github.com/dokku/dokku-daemon
cd dokku-daemon
sudo make install
systemctl start dokku-daemon
```
The Sandbox platform connects to the Dokku server via SSH, using SSH keys specifically generated for this connection. The SSH key is stored on the Sandbox server, and the following environment variables are set in /backend/server/.env:
```bash
DOKKU_HOST=
DOKKU_USERNAME=
DOKKU_KEY=
```
## Deploying to AWS
The backend server and deployments server can be deployed using AWS's EC2 service. See [our video guide](https://www.youtube.com/watch?v=WN8HQnimjmk) on how to do this.
## Creating Custom Templates
Anyone can contribute a custom template for integration in Sandbox. Since Sandbox is built on E2B, there is no limitation to what language or runtime a Sandbox can use.
Currently there are five templates:
- [jamesmurdza/dokku-reactjs-template](https://github.com/jamesmurdza/dokku-reactjs-template)
- [jamesmurdza/dokku-vanillajs-template](https://github.com/jamesmurdza/dokku-vanillajs-template)
- [jamesmurdza/dokku-nextjs-template](https://github.com/jamesmurdza/dokku-nextjs-template)
- [jamesmurdza/dokku-streamlit-template](https://github.com/jamesmurdza/dokku-streamlit-template)
- [omarrwd/dokku-php-template](https://github.com/omarrwd/dokku-php-template)
To create your own template, you can fork one of the above templates or start with a new blank repository. The template should have at least an `e2b.Dockerfile`, which is used by E2B to create the development environment. Optionally, a `Dockerfile` can be added which will be used to create the project build when it is deployed.
To test the template, you must have an [E2B account](https://e2b.dev/) and the [E2B CLI tools](https://e2b.dev/docs/cli) installed. Then, in the Terminal, run:
```
e2b auth login
```
Then, navigate to your template directory and run the following command where **TEMPLATENAME** is the name of your template:
```
e2b template build -d e2b.Dockerfile -n TEMPLATENAME
```
Finally, to test your template run:
```
e2b sandbox spawn TEMPLATENAME
cd project
```
You will see a URL in the form of `https://xxxxxxxxxxxxxxxxxxx.e2b-staging.com`.
Now, run the command to start your development server.
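For a Node-based template, for example, that command might simply be (the exact command depends on your template):
```bash
npm run dev
```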
To see the running server, visit the public URL `https://<PORT>-xxxxxxxxxxxxxxxxxxx.e2b-staging.com`.
If you've done this and it works, let us know and we'll add your template to Sandbox! Please reach out to us [on Discord](https://discord.gitwit.dev/) with any questions or to submit your working template.
Note: In the future, we will add a way to specify the command triggered by the "Run" button (e.g. "npm run dev").
For more information, see:
- [Custom E2B Sandboxes](https://e2b.dev/docs/sandbox-template)
- [Dokku Builders](https://dokku.com/docs/deployment/builders/builder-management/)
---
## Contributing
Thanks for your interest in contributing! Review this section before submitting your first pull request. If you need any help, feel free to contact us [on Discord](https://discord.gitwit.dev/).
Thanks for your interest in contributing! Review this section before submitting your first pull request. If you need any help, feel free to reach out to [@ishaandey\_](https://x.com/ishaandey_).
Please prioritize existing issues, but feel free to contribute new issues if you have ideas for a feature or bug that you think would be useful.
### Structure
@ -262,7 +110,8 @@ backend/
├── database/
│ ├── src
│ └── drizzle
└── storage
├── storage
└── ai
```
| Path | Description |
@ -271,6 +120,7 @@ backend/
| `backend/server` | The Express websocket server. |
| `backend/database` | API for interfacing with the D1 database (SQLite). |
| `backend/storage` | API for interfacing with R2 storage. Service-bound to `/backend/database`. |
| `backend/ai` | API for making requests to Workers AI. |
### Development
@ -291,10 +141,6 @@ cd sandbox
git checkout -b my-new-branch
```
### Code formatting
This repository uses [Prettier](https://marketplace.cursorapi.com/items?itemName=esbenp.prettier-vscode) for code formatting, which you will be prompted to install when you open the project. The formatting rules are specified in [.prettierrc](.prettierrc).
### Commit convention
Before you create a Pull Request, please check that you use the [Conventional Commits format](https://www.conventionalcommits.org/en/v1.0.0/)
@ -303,15 +149,11 @@ It should be in the form `category(scope or module): message` in your commit message (see the example after the category list below)
- `feat / feature`: all changes that introduce completely new code or new
features
- `fix`: changes that fix a bug (ideally you will additionally reference an
issue if present)
- `refactor`: any code related change that is not a fix nor a feature
- `docs`: changing existing or creating new documentation (i.e. README, docs for
usage of a lib or cli usage)
- `chore`: all changes to the repository that do not fit into any of the above
categories
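For example, an illustrative commit message (not one from this repository) in this format:
```
feat(frontend): add AI code autocomplete to the editor
```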
@ -339,4 +181,3 @@ It should be in the form `category(scope or module): message` in your commit message
- [Express](https://expressjs.com/)
- [Socket.io](https://socket.io/)
- [Drizzle ORM](https://orm.drizzle.team/)
- [E2B](https://e2b.dev/)

12
backend/ai/.editorconfig Normal file

@ -0,0 +1,12 @@
# http://editorconfig.org
root = true
[*]
indent_style = tab
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
[*.yml]
indent_style = space

172
backend/ai/.gitignore vendored Normal file

@ -0,0 +1,172 @@
# Logs
logs
_.log
npm-debug.log_
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]_.[0-9]_.[0-9]_.[0-9]_.json
# Runtime data
pids
_.pid
_.seed
\*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
\*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# Snowpack dependency directory (https://snowpack.dev/)
web_modules/
# TypeScript cache
\*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Optional stylelint cache
.stylelintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
\*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local
# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache
# Next.js build output
.next
out
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# vuepress v2.x temp and cache directory
.temp
.cache
# Docusaurus cache and generated files
.docusaurus
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# Stores VSCode versions used for testing VSCode extensions
.vscode-test
# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.\*
# wrangler project
.dev.vars
.wrangler/

5
backend/ai/.prettierrc Normal file

@ -0,0 +1,5 @@
{
"tabWidth": 2,
"semi": false,
"singleQuote": false
}

3113
backend/ai/package-lock.json generated Normal file

File diff suppressed because it is too large.

19
backend/ai/package.json Normal file

@ -0,0 +1,19 @@
{
"name": "ai",
"version": "0.0.0",
"private": true,
"scripts": {
"deploy": "wrangler deploy",
"dev": "wrangler dev",
"start": "wrangler dev",
"test": "vitest",
"cf-typegen": "wrangler types"
},
"devDependencies": {
"@cloudflare/vitest-pool-workers": "^0.1.0",
"@cloudflare/workers-types": "^4.20240512.0",
"typescript": "^5.0.4",
"vitest": "1.3.0",
"wrangler": "^3.0.0"
}
}

43
backend/ai/src/index.ts Normal file

@ -0,0 +1,43 @@
export interface Env {
AI: any
}
export default {
async fetch(request, env): Promise<Response> {
if (request.method !== "GET") {
return new Response("Method Not Allowed", { status: 405 })
}
const url = new URL(request.url)
const fileName = url.searchParams.get("fileName")
const instructions = url.searchParams.get("instructions")
const line = url.searchParams.get("line")
const code = url.searchParams.get("code")
const response = await env.AI.run("@cf/meta/llama-3-8b-instruct", {
messages: [
{
role: "system",
content:
"You are an expert coding assistant. You read code from a file, and you suggest new code to add to the file. You may be given instructions on what to generate, which you should follow. You should generate code that is CORRECT, efficient, and follows best practices. You may generate multiple lines of code if necessary. When you generate code, you should ONLY return the code, and nothing else. You MUST NOT include backticks in the code you generate.",
},
{
role: "user",
content: `The file is called ${fileName}.`,
},
{
role: "user",
content: `Here are my instructions on what to generate: ${instructions}.`,
},
{
role: "user",
content: `Suggest me code to insert at line ${line} in my file. Give only the code, and NOTHING else. DO NOT include backticks in your response. My code file content is as follows
${code}`,
},
],
})
return new Response(JSON.stringify(response))
},
} satisfies ExportedHandler<Env>
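As a rough illustration of how this worker is meant to be called (the URL and example values below are placeholders; the query parameter names match the ones read above):
```ts
// Hypothetical client-side call to the AI worker (placeholder URL and values).
const params = new URLSearchParams({
  fileName: "index.ts",
  instructions: "Add a function that sums an array of numbers",
  line: "10",
  code: "export const nums = [1, 2, 3]\n",
})

const res = await fetch(`https://ai.example.workers.dev/?${params.toString()}`)
console.log(await res.json())
```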


@ -0,0 +1,25 @@
// test/index.spec.ts
import { env, createExecutionContext, waitOnExecutionContext, SELF } from 'cloudflare:test';
import { describe, it, expect } from 'vitest';
import worker from '../src/index';
// For now, you'll need to do something like this to get a correctly-typed
// `Request` to pass to `worker.fetch()`.
const IncomingRequest = Request<unknown, IncomingRequestCfProperties>;
describe('Hello World worker', () => {
it('responds with Hello World! (unit style)', async () => {
const request = new IncomingRequest('http://example.com');
// Create an empty context to pass to `worker.fetch()`.
const ctx = createExecutionContext();
const response = await worker.fetch(request, env, ctx);
// Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions
await waitOnExecutionContext(ctx);
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
it('responds with Hello World! (integration style)', async () => {
const response = await SELF.fetch('https://example.com');
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
});


@ -0,0 +1,11 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"types": [
"@cloudflare/workers-types/experimental",
"@cloudflare/vitest-pool-workers"
]
},
"include": ["./**/*.ts", "../src/env.d.ts"],
"exclude": []
}

104
backend/ai/tsconfig.json Normal file

@ -0,0 +1,104 @@
{
"compilerOptions": {
/* Visit https://aka.ms/tsconfig.json to read more about this file */
/* Projects */
// "incremental": true, /* Enable incremental compilation */
// "composite": true, /* Enable constraints that allow a TypeScript project to be used with project references. */
// "tsBuildInfoFile": "./", /* Specify the folder for .tsbuildinfo incremental compilation files. */
// "disableSourceOfProjectReferenceRedirect": true, /* Disable preferring source files instead of declaration files when referencing composite projects */
// "disableSolutionSearching": true, /* Opt a project out of multi-project reference checking when editing. */
// "disableReferencedProjectLoad": true, /* Reduce the number of projects loaded automatically by TypeScript. */
/* Language and Environment */
"target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */,
"lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */,
"jsx": "react" /* Specify what JSX code is generated. */,
// "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
// "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h' */
// "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */
// "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using `jsx: react-jsx*`.` */
// "reactNamespace": "", /* Specify the object invoked for `createElement`. This only applies when targeting `react` JSX emit. */
// "noLib": true, /* Disable including any library files, including the default lib.d.ts. */
// "useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */
/* Modules */
"module": "es2022" /* Specify what module code is generated. */,
// "rootDir": "./", /* Specify the root folder within your source files. */
"moduleResolution": "Bundler" /* Specify how TypeScript looks up a file from a given module specifier. */,
// "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */
// "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
// "typeRoots": [], /* Specify multiple folders that act like `./node_modules/@types`. */
"types": [
"@cloudflare/workers-types/2023-07-01"
] /* Specify type package names to be included without being referenced in a source file. */,
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
"resolveJsonModule": true /* Enable importing .json files */,
// "noResolve": true, /* Disallow `import`s, `require`s or `<reference>`s from expanding the number of files TypeScript should add to a project. */
/* JavaScript Support */
"allowJs": true /* Allow JavaScript files to be a part of your program. Use the `checkJS` option to get errors from these files. */,
"checkJs": false /* Enable error reporting in type-checked JavaScript files. */,
// "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from `node_modules`. Only applicable with `allowJs`. */
/* Emit */
// "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */
// "declarationMap": true, /* Create sourcemaps for d.ts files. */
// "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */
// "sourceMap": true, /* Create source map files for emitted JavaScript files. */
// "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If `declaration` is true, also designates a file that bundles all .d.ts output. */
// "outDir": "./", /* Specify an output folder for all emitted files. */
// "removeComments": true, /* Disable emitting comments. */
"noEmit": true /* Disable emitting files from a compilation. */,
// "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */
// "importsNotUsedAsValues": "remove", /* Specify emit/checking behavior for imports that are only used for types */
// "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */
// "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */
// "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */
// "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */
// "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */
// "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */
// "newLine": "crlf", /* Set the newline character for emitting files. */
// "stripInternal": true, /* Disable emitting declarations that have `@internal` in their JSDoc comments. */
// "noEmitHelpers": true, /* Disable generating custom helper functions like `__extends` in compiled output. */
// "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */
// "preserveConstEnums": true, /* Disable erasing `const enum` declarations in generated code. */
// "declarationDir": "./", /* Specify the output directory for generated declaration files. */
// "preserveValueImports": true, /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */
/* Interop Constraints */
"isolatedModules": true /* Ensure that each file can be safely transpiled without relying on other imports. */,
"allowSyntheticDefaultImports": true /* Allow 'import x from y' when a module doesn't have a default export. */,
// "esModuleInterop": true /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables `allowSyntheticDefaultImports` for type compatibility. */,
// "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */
"forceConsistentCasingInFileNames": true /* Ensure that casing is correct in imports. */,
/* Type Checking */
"strict": true /* Enable all strict type-checking options. */,
// "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied `any` type.. */
// "strictNullChecks": true, /* When type checking, take into account `null` and `undefined`. */
// "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */
// "strictBindCallApply": true, /* Check that the arguments for `bind`, `call`, and `apply` methods match the original function. */
// "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */
// "noImplicitThis": true, /* Enable error reporting when `this` is given the type `any`. */
// "useUnknownInCatchVariables": true, /* Type catch clause variables as 'unknown' instead of 'any'. */
// "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */
// "noUnusedLocals": true, /* Enable error reporting when a local variables aren't read. */
// "noUnusedParameters": true, /* Raise an error when a function parameter isn't read */
// "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */
// "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */
// "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */
// "noUncheckedIndexedAccess": true, /* Include 'undefined' in index signature results */
// "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */
// "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type */
// "allowUnusedLabels": true, /* Disable error reporting for unused labels. */
// "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */
/* Completeness */
// "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */
"skipLibCheck": true /* Skip type checking all .d.ts files. */
},
"exclude": ["test"]
}


@ -0,0 +1,11 @@
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config";
export default defineWorkersConfig({
test: {
poolOptions: {
workers: {
wrangler: { configPath: "./wrangler.toml" },
},
},
},
});

4
backend/ai/worker-configuration.d.ts vendored Normal file

@ -0,0 +1,4 @@
// Generated by Wrangler
// After adding bindings to `wrangler.toml`, regenerate this interface via `npm run cf-typegen`
interface Env {
}


@ -0,0 +1,7 @@
name = "ai"
main = "src/index.ts"
compatibility_date = "2024-05-12"
compatibility_flags = ["nodejs_compat"]
[ai]
binding = "AI"


@ -0,0 +1,5 @@
{
"tabWidth": 2,
"semi": false,
"singleQuote": false
}


@ -1,4 +1,4 @@
import type { Config } from "drizzle-kit"
import type { Config } from "drizzle-kit";
export default process.env.LOCAL_DB_PATH
? ({
@ -16,4 +16,4 @@ export default process.env.LOCAL_DB_PATH
wranglerConfigPath: "wrangler.toml",
dbName: "d1-sandbox",
},
} satisfies Config)
} satisfies Config);


@ -1,46 +0,0 @@
CREATE TABLE `sandbox` (
`id` text PRIMARY KEY NOT NULL,
`name` text NOT NULL,
`type` text NOT NULL,
`visibility` text,
`createdAt` integer DEFAULT CURRENT_TIMESTAMP,
`user_id` text NOT NULL,
`likeCount` integer DEFAULT 0,
`viewCount` integer DEFAULT 0,
FOREIGN KEY (`user_id`) REFERENCES `user`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `sandbox_likes` (
`user_id` text NOT NULL,
`sandbox_id` text NOT NULL,
`createdAt` integer DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY(`sandbox_id`, `user_id`),
FOREIGN KEY (`user_id`) REFERENCES `user`(`id`) ON UPDATE no action ON DELETE no action,
FOREIGN KEY (`sandbox_id`) REFERENCES `sandbox`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE TABLE `user` (
`id` text PRIMARY KEY NOT NULL,
`name` text NOT NULL,
`email` text NOT NULL,
`username` text NOT NULL,
`avatarUrl` text,
`githubToken` text,
`createdAt` integer DEFAULT CURRENT_TIMESTAMP,
`generations` integer DEFAULT 0,
`tier` text DEFAULT 'FREE',
`tierExpiresAt` integer,
`lastResetDate` integer
);
--> statement-breakpoint
CREATE TABLE `users_to_sandboxes` (
`userId` text NOT NULL,
`sandboxId` text NOT NULL,
`sharedOn` integer,
FOREIGN KEY (`userId`) REFERENCES `user`(`id`) ON UPDATE no action ON DELETE no action,
FOREIGN KEY (`sandboxId`) REFERENCES `sandbox`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
CREATE UNIQUE INDEX `sandbox_id_unique` ON `sandbox` (`id`);--> statement-breakpoint
CREATE UNIQUE INDEX `user_id_unique` ON `user` (`id`);--> statement-breakpoint
CREATE UNIQUE INDEX `user_username_unique` ON `user` (`username`);


@ -1,3 +0,0 @@
ALTER TABLE user ADD `bio` text;--> statement-breakpoint
ALTER TABLE user ADD `personalWebsite` text;--> statement-breakpoint
ALTER TABLE user ADD `links` text DEFAULT '[]';


@ -1 +0,0 @@
ALTER TABLE sandbox ADD `containerId` text;


@ -1,7 +1,7 @@
{
"version": "5",
"dialect": "sqlite",
"id": "4ada398d-7e4e-448f-8cea-a10b4d844600",
"id": "6570ba20-a672-400c-8147-7ba533784918",
"prevId": "00000000-0000-0000-0000-000000000000",
"tables": {
"sandbox": {
@ -35,36 +35,12 @@
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"likeCount": {
"name": "likeCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"viewCount": {
"name": "viewCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
}
},
"indexes": {
@ -94,72 +70,6 @@
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"sandbox_likes": {
"name": "sandbox_likes",
"columns": {
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandbox_id": {
"name": "sandbox_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
}
},
"indexes": {},
"foreignKeys": {
"sandbox_likes_user_id_user_id_fk": {
"name": "sandbox_likes_user_id_user_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"sandbox_likes_sandbox_id_sandbox_id_fk": {
"name": "sandbox_likes_sandbox_id_sandbox_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "sandbox",
"columnsFrom": [
"sandbox_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"sandbox_likes_sandbox_id_user_id_pk": {
"columns": [
"sandbox_id",
"user_id"
],
"name": "sandbox_likes_sandbox_id_user_id_pk"
}
},
"uniqueConstraints": {}
},
"user": {
"name": "user",
"columns": {
@ -183,65 +93,6 @@
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"avatarUrl": {
"name": "avatarUrl",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"githubToken": {
"name": "githubToken",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"generations": {
"name": "generations",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"tier": {
"name": "tier",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'FREE'"
},
"tierExpiresAt": {
"name": "tierExpiresAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"lastResetDate": {
"name": "lastResetDate",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
@ -251,13 +102,6 @@
"id"
],
"isUnique": true
},
"user_username_unique": {
"name": "user_username_unique",
"columns": [
"username"
],
"isUnique": true
}
},
"foreignKeys": {},
@ -280,13 +124,6 @@
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sharedOn": {
"name": "sharedOn",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},


@ -1,353 +1,175 @@
{
"version": "5",
"dialect": "sqlite",
"id": "80c0b0b2-bb0e-449a-b447-c21863686f58",
"prevId": "4ada398d-7e4e-448f-8cea-a10b4d844600",
"tables": {
"sandbox": {
"name": "sandbox",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"visibility": {
"name": "visibility",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"likeCount": {
"name": "likeCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"viewCount": {
"name": "viewCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
}
"version": "5",
"dialect": "sqlite",
"id": "9f64104a-4954-40c0-8155-17755ea0a243",
"prevId": "6570ba20-a672-400c-8147-7ba533784918",
"tables": {
"sandbox": {
"name": "sandbox",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"indexes": {
"sandbox_id_unique": {
"name": "sandbox_id_unique",
"columns": [
"id"
],
"isUnique": true
}
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"foreignKeys": {
"sandbox_user_id_user_id_fk": {
"name": "sandbox_user_id_user_id_fk",
"tableFrom": "sandbox",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
"visibility": {
"name": "visibility",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"sandbox_likes": {
"name": "sandbox_likes",
"columns": {
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandbox_id": {
"name": "sandbox_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
}
},
"indexes": {},
"foreignKeys": {
"sandbox_likes_user_id_user_id_fk": {
"name": "sandbox_likes_user_id_user_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"sandbox_likes_sandbox_id_sandbox_id_fk": {
"name": "sandbox_likes_sandbox_id_sandbox_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "sandbox",
"columnsFrom": [
"sandbox_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"sandbox_likes_sandbox_id_user_id_pk": {
"columns": [
"sandbox_id",
"user_id"
],
"name": "sandbox_likes_sandbox_id_user_id_pk"
}
},
"uniqueConstraints": {}
"indexes": {
"sandbox_id_unique": {
"name": "sandbox_id_unique",
"columns": [
"id"
],
"isUnique": true
}
},
"user": {
"name": "user",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"email": {
"name": "email",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"avatarUrl": {
"name": "avatarUrl",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"githubToken": {
"name": "githubToken",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"generations": {
"name": "generations",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"bio": {
"name": "bio",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"personalWebsite": {
"name": "personalWebsite",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"links": {
"name": "links",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'[]'"
},
"tier": {
"name": "tier",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'FREE'"
},
"tierExpiresAt": {
"name": "tierExpiresAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"lastResetDate": {
"name": "lastResetDate",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"user_id_unique": {
"name": "user_id_unique",
"columns": [
"id"
],
"isUnique": true
},
"user_username_unique": {
"name": "user_username_unique",
"columns": [
"username"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
"foreignKeys": {
"sandbox_user_id_user_id_fk": {
"name": "sandbox_user_id_user_id_fk",
"tableFrom": "sandbox",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"users_to_sandboxes": {
"name": "users_to_sandboxes",
"columns": {
"userId": {
"name": "userId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandboxId": {
"name": "sandboxId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sharedOn": {
"name": "sharedOn",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"users_to_sandboxes_userId_user_id_fk": {
"name": "users_to_sandboxes_userId_user_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "user",
"columnsFrom": [
"userId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"users_to_sandboxes_sandboxId_sandbox_id_fk": {
"name": "users_to_sandboxes_sandboxId_sandbox_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "sandbox",
"columnsFrom": [
"sandboxId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
}
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
"user": {
"name": "user",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"email": {
"name": "email",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"user_id_unique": {
"name": "user_id_unique",
"columns": [
"id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"users_to_sandboxes": {
"name": "users_to_sandboxes",
"columns": {
"userId": {
"name": "userId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandboxId": {
"name": "sandboxId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"users_to_sandboxes_userId_user_id_fk": {
"name": "users_to_sandboxes_userId_user_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "user",
"columnsFrom": [
"userId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"users_to_sandboxes_sandboxId_sandbox_id_fk": {
"name": "users_to_sandboxes_sandboxId_sandbox_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "sandbox",
"columnsFrom": [
"sandboxId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
}
}
},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
}
}


@ -1,8 +1,8 @@
{
"version": "5",
"dialect": "sqlite",
"id": "51abcf01-2921-4885-8058-d1ccd576f3e1",
"prevId": "80c0b0b2-bb0e-449a-b447-c21863686f58",
"id": "5baf10d6-7697-42ba-a11a-ee4c7bd7e91e",
"prevId": "9f64104a-4954-40c0-8155-17755ea0a243",
"tables": {
"sandbox": {
"name": "sandbox",
@ -35,43 +35,12 @@
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"likeCount": {
"name": "likeCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"viewCount": {
"name": "viewCount",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"containerId": {
"name": "containerId",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
@ -101,72 +70,6 @@
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"sandbox_likes": {
"name": "sandbox_likes",
"columns": {
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandbox_id": {
"name": "sandbox_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
}
},
"indexes": {},
"foreignKeys": {
"sandbox_likes_user_id_user_id_fk": {
"name": "sandbox_likes_user_id_user_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"sandbox_likes_sandbox_id_sandbox_id_fk": {
"name": "sandbox_likes_sandbox_id_sandbox_id_fk",
"tableFrom": "sandbox_likes",
"tableTo": "sandbox",
"columnsFrom": [
"sandbox_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {
"sandbox_likes_sandbox_id_user_id_pk": {
"columns": [
"sandbox_id",
"user_id"
],
"name": "sandbox_likes_sandbox_id_user_id_pk"
}
},
"uniqueConstraints": {}
},
"user": {
"name": "user",
"columns": {
@ -190,87 +93,6 @@
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"username": {
"name": "username",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"avatarUrl": {
"name": "avatarUrl",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"githubToken": {
"name": "githubToken",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"createdAt": {
"name": "createdAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "CURRENT_TIMESTAMP"
},
"generations": {
"name": "generations",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": 0
},
"bio": {
"name": "bio",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"personalWebsite": {
"name": "personalWebsite",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"links": {
"name": "links",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'[]'"
},
"tier": {
"name": "tier",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false,
"default": "'FREE'"
},
"tierExpiresAt": {
"name": "tierExpiresAt",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"lastResetDate": {
"name": "lastResetDate",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
@ -280,13 +102,6 @@
"id"
],
"isUnique": true
},
"user_username_unique": {
"name": "user_username_unique",
"columns": [
"username"
],
"isUnique": true
}
},
"foreignKeys": {},
@ -309,13 +124,6 @@
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sharedOn": {
"name": "sharedOn",
"type": "integer",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {},


@ -0,0 +1,175 @@
{
"version": "5",
"dialect": "sqlite",
"id": "37e38b82-1494-4818-8c26-b9024cce3fa9",
"prevId": "5baf10d6-7697-42ba-a11a-ee4c7bd7e91e",
"tables": {
"sandbox": {
"name": "sandbox",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"type": {
"name": "type",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"visibility": {
"name": "visibility",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
},
"user_id": {
"name": "user_id",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {
"sandbox_id_unique": {
"name": "sandbox_id_unique",
"columns": [
"id"
],
"isUnique": true
}
},
"foreignKeys": {
"sandbox_user_id_user_id_fk": {
"name": "sandbox_user_id_user_id_fk",
"tableFrom": "sandbox",
"tableTo": "user",
"columnsFrom": [
"user_id"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"user": {
"name": "user",
"columns": {
"id": {
"name": "id",
"type": "text",
"primaryKey": true,
"notNull": true,
"autoincrement": false
},
"name": {
"name": "name",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"email": {
"name": "email",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"image": {
"name": "image",
"type": "text",
"primaryKey": false,
"notNull": false,
"autoincrement": false
}
},
"indexes": {
"user_id_unique": {
"name": "user_id_unique",
"columns": [
"id"
],
"isUnique": true
}
},
"foreignKeys": {},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
},
"users_to_sandboxes": {
"name": "users_to_sandboxes",
"columns": {
"userId": {
"name": "userId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
},
"sandboxId": {
"name": "sandboxId",
"type": "text",
"primaryKey": false,
"notNull": true,
"autoincrement": false
}
},
"indexes": {},
"foreignKeys": {
"users_to_sandboxes_userId_user_id_fk": {
"name": "users_to_sandboxes_userId_user_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "user",
"columnsFrom": [
"userId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
},
"users_to_sandboxes_sandboxId_sandbox_id_fk": {
"name": "users_to_sandboxes_sandboxId_sandbox_id_fk",
"tableFrom": "users_to_sandboxes",
"tableTo": "sandbox",
"columnsFrom": [
"sandboxId"
],
"columnsTo": [
"id"
],
"onDelete": "no action",
"onUpdate": "no action"
}
},
"compositePrimaryKeys": {},
"uniqueConstraints": {}
}
},
"enums": {},
"_meta": {
"schemas": {},
"tables": {},
"columns": {}
}
}


@ -5,22 +5,50 @@
{
"idx": 0,
"version": "5",
"when": 1736155854410,
"tag": "0000_sudden_wallop",
"when": 1714540200800,
"tag": "0000_big_rogue",
"breakpoints": true
},
{
"idx": 1,
"version": "5",
"when": 1736169498666,
"tag": "0001_dusty_komodo",
"when": 1714541190588,
"tag": "0001_empty_black_knight",
"breakpoints": true
},
{
"idx": 2,
"version": "5",
"when": 1736768910615,
"tag": "0002_chemical_brother_voodoo",
"when": 1714541209173,
"tag": "0002_sour_ego",
"breakpoints": true
},
{
"idx": 3,
"version": "5",
"when": 1714541233589,
"tag": "0003_pale_overlord",
"breakpoints": true
},
{
"idx": 4,
"version": "5",
"when": 1714565073180,
"tag": "0004_cuddly_wolf_cub",
"breakpoints": true
},
{
"idx": 5,
"version": "5",
"when": 1714950365718,
"tag": "0005_last_the_twelve",
"breakpoints": true
},
{
"idx": 6,
"version": "5",
"when": 1716432225404,
"tag": "0006_lively_mattie_franklin",
"breakpoints": true
}
]

File diff suppressed because it is too large.


@ -1,32 +1,32 @@
{
"name": "database",
"version": "0.0.0",
"private": true,
"scripts": {
"deploy": "wrangler deploy",
"dev": "wrangler dev",
"start": "wrangler dev",
"test": "vitest",
"generate": "drizzle-kit generate:sqlite --schema=src/schema.ts",
"up": "drizzle-kit up:sqlite --schema=src/schema.ts",
"db:studio": "cross-env LOCAL_DB_PATH=$(find .wrangler/state/v3/d1/miniflare-D1DatabaseObject -type f -name '*.sqlite' -print -quit) drizzle-kit studio"
},
"devDependencies": {
"@cloudflare/vitest-pool-workers": "^0.1.0",
"@cloudflare/workers-types": "^4.20240405.0",
"@types/itty-router-extras": "^0.4.3",
"drizzle-kit": "^0.20.17",
"typescript": "^5.0.4",
"vitest": "1.3.0",
"wrangler": "^3.101.0"
},
"dependencies": {
"@paralleldrive/cuid2": "^2.2.2",
"better-sqlite3": "^9.5.0",
"cross-env": "^7.0.3",
"drizzle-orm": "^0.30.8",
"itty-router": "^5.0.16",
"itty-router-extras": "^0.4.6",
"zod": "^3.22.4"
}
"name": "database",
"version": "0.0.0",
"private": true,
"scripts": {
"deploy": "wrangler deploy",
"dev": "wrangler dev",
"start": "wrangler dev",
"test": "vitest",
"generate": "drizzle-kit generate:sqlite --schema=src/schema.ts",
"up": "drizzle-kit up:sqlite --schema=src/schema.ts",
"db:studio": "cross-env LOCAL_DB_PATH=$(find .wrangler/state/v3/d1/miniflare-D1DatabaseObject -type f -name '*.sqlite' -print -quit) drizzle-kit studio"
},
"devDependencies": {
"@cloudflare/vitest-pool-workers": "^0.1.0",
"@cloudflare/workers-types": "^4.20240405.0",
"@types/itty-router-extras": "^0.4.3",
"drizzle-kit": "^0.20.17",
"typescript": "^5.0.4",
"vitest": "1.3.0",
"wrangler": "^3.0.0"
},
"dependencies": {
"@paralleldrive/cuid2": "^2.2.2",
"better-sqlite3": "^9.5.0",
"cross-env": "^7.0.3",
"drizzle-orm": "^0.30.8",
"itty-router": "^5.0.16",
"itty-router-extras": "^0.4.6",
"zod": "^3.22.4"
}
}


@ -1,603 +1,303 @@
// import type { DrizzleD1Database } from "drizzle-orm/d1";
import { drizzle } from "drizzle-orm/d1"
import { json } from "itty-router-extras"
import { z } from "zod"
import { ZodError, z } from "zod"
import { and, eq, sql } from "drizzle-orm"
import { user, sandbox, usersToSandboxes } from "./schema"
import * as schema from "./schema"
import { Sandbox, sandbox, sandboxLikes, user, usersToSandboxes } from "./schema"
import { and, eq, sql } from "drizzle-orm"
export interface Env {
DB: D1Database
STORAGE: any
KEY: string
STORAGE_WORKER_URL: string
DB: D1Database
STORAGE: any
KEY: string
STORAGE_WORKER_URL: string
}
// https://github.com/drizzle-team/drizzle-orm/tree/main/examples/cloudflare-d1
// npm run generate
// npx wrangler d1 execute d1-sandbox --local --file=./drizzle/<FILE>
interface SandboxWithLiked extends Sandbox {
liked: boolean
}
interface UserResponse extends Omit<schema.User, "sandbox"> {
sandbox: SandboxWithLiked[]
}
export default {
async fetch(
request: Request,
env: Env,
ctx: ExecutionContext
): Promise<Response> {
const success = new Response("Success", { status: 200 })
const invalidRequest = new Response("Invalid Request", { status: 400 })
const notFound = new Response("Not Found", { status: 404 })
const methodNotAllowed = new Response("Method Not Allowed", { status: 405 })
async fetch(
request: Request,
env: Env,
ctx: ExecutionContext
): Promise<Response> {
const success = new Response("Success", { status: 200 })
const invalidRequest = new Response("Invalid Request", { status: 400 })
const notFound = new Response("Not Found", { status: 404 })
const methodNotAllowed = new Response("Method Not Allowed", { status: 405 })
const url = new URL(request.url)
const path = url.pathname
const method = request.method
const url = new URL(request.url)
const path = url.pathname
const method = request.method
if (request.headers.get("Authorization") !== env.KEY) {
return new Response("Unauthorized", { status: 401 })
}
if (request.headers.get("Authorization") !== env.KEY) {
return new Response("Unauthorized", { status: 401 })
}
const db = drizzle(env.DB, { schema })
const db = drizzle(env.DB, { schema })
if (path === "/api/sandbox") {
if (method === "GET") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
const res = await db.query.sandbox.findFirst({
where: (sandbox, { eq }) => eq(sandbox.id, id),
with: {
usersToSandboxes: true,
},
})
return json(res ?? {})
} else {
const res = await db.select().from(sandbox).all()
return json(res ?? {})
}
} else if (method === "DELETE") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
await db.delete(sandboxLikes).where(eq(sandboxLikes.sandboxId, id))
await db
.delete(usersToSandboxes)
.where(eq(usersToSandboxes.sandboxId, id))
await db.delete(sandbox).where(eq(sandbox.id, id))
if (path === "/api/sandbox") {
if (method === "GET") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
const res = await db.query.sandbox.findFirst({
where: (sandbox, { eq }) => eq(sandbox.id, id),
with: {
usersToSandboxes: true,
},
})
return json(res ?? {})
} else {
const res = await db.select().from(sandbox).all()
return json(res ?? {})
}
} else if (method === "DELETE") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
await db
.delete(usersToSandboxes)
.where(eq(usersToSandboxes.sandboxId, id))
await db.delete(sandbox).where(eq(sandbox.id, id))
const deleteStorageRequest = new Request(
`${env.STORAGE_WORKER_URL}/api/project`,
{
method: "DELETE",
body: JSON.stringify({ sandboxId: id }),
headers: {
"Content-Type": "application/json",
Authorization: `${env.KEY}`,
},
}
)
await env.STORAGE.fetch(deleteStorageRequest)
const deleteStorageRequest = new Request(
`${env.STORAGE_WORKER_URL}/api/project`,
{
method: "DELETE",
body: JSON.stringify({ sandboxId: id }),
headers: {
"Content-Type": "application/json",
Authorization: `${env.KEY}`,
},
}
)
await env.STORAGE.fetch(deleteStorageRequest)
return success
} else {
return invalidRequest
}
} else if (method === "POST") {
const postSchema = z.object({
id: z.string(),
name: z.string().optional(),
visibility: z.enum(["public", "private"]).optional(),
})
const body = await request.json()
const { id, name, visibility } = postSchema.parse(body)
const sb = await db
.update(sandbox)
.set({ name, visibility })
.where(eq(sandbox.id, id))
.returning()
.get()
return success
} else if (method === "PUT") {
const initSchema = z.object({
type: z.string(),
name: z.string(),
userId: z.string(),
visibility: z.enum(["public", "private"]),
})
return success
} else if (method === "PUT") {
const initSchema = z.object({
type: z.enum(["react", "node"]),
name: z.string(),
userId: z.string(),
visibility: z.enum(["public", "private"]),
})
const body = await request.json()
const { type, name, userId, visibility } = initSchema.parse(body)
const userSandboxes = await db
.select()
.from(sandbox)
.where(eq(sandbox.userId, userId))
.all()
const allSandboxes = await db.select().from(sandbox).all()
if (allSandboxes.length >= 8) {
return new Response("You reached the maximum # of sandboxes.", {
status: 400,
})
}
if (userSandboxes.length >= 8) {
return new Response("You reached the maximum # of sandboxes.", {
status: 400,
})
}
const sb = await db
.insert(sandbox)
.values({ type, name, userId, visibility, createdAt: new Date() })
.returning()
.get()
const initStorageRequest = new Request(
`${env.STORAGE_WORKER_URL}/api/init`,
{
method: "POST",
body: JSON.stringify({ sandboxId: sb.id, type }),
headers: {
"Content-Type": "application/json",
Authorization: `${env.KEY}`,
},
}
)
await env.STORAGE.fetch(initStorageRequest)
return new Response(sb.id, { status: 200 })
} else {
return methodNotAllowed
}
} else if (path === "/api/sandbox/share") {
if (method === "GET") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
const res = await db.query.usersToSandboxes.findMany({
where: (uts, { eq }) => eq(uts.userId, id),
})
const owners = await Promise.all(
res.map(async (r) => {
const sb = await db.query.sandbox.findFirst({
where: (sandbox, { eq }) => eq(sandbox.id, r.sandboxId),
with: {
author: true,
},
})
if (!sb) return
return {
id: sb.id,
name: sb.name,
type: sb.type,
author: sb.author.name,
sharedOn: r.sharedOn,
}
})
)
const owners = await Promise.all(
res.map(async (r) => {
const sb = await db.query.sandbox.findFirst({
where: (sandbox, { eq }) => eq(sandbox.id, r.sandboxId),
with: {
author: true,
},
})
if (!sb) return
return {
id: sb.id,
name: sb.name,
type: sb.type,
author: sb.author.name,
authorAvatarUrl: sb.author.avatarUrl,
sharedOn: r.sharedOn,
}
})
)
return json(owners ?? {})
} else return invalidRequest
} else if (method === "POST") {
const shareSchema = z.object({
sandboxId: z.string(),
email: z.string(),
})
const body = await request.json()
const { sandboxId, email } = shareSchema.parse(body)
const user = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.email, email),
with: {
sandbox: true,
usersToSandboxes: true,
},
})
if (!user) {
return new Response("No user associated with email.", { status: 400 })
}
if (user.sandbox.find((sb) => sb.id === sandboxId)) {
return new Response("Cannot share with yourself!", { status: 400 })
}
if (user.usersToSandboxes.find((uts) => uts.sandboxId === sandboxId)) {
return new Response("User already has access.", { status: 400 })
}
await db
.insert(usersToSandboxes)
.values({ userId: user.id, sandboxId, sharedOn: new Date() })
.get()
return success
} else if (method === "DELETE") {
const deleteShareSchema = z.object({
sandboxId: z.string(),
userId: z.string(),
})
const body = await request.json()
const { sandboxId, userId } = deleteShareSchema.parse(body)
await db
.delete(usersToSandboxes)
.where(
and(
eq(usersToSandboxes.userId, userId),
eq(usersToSandboxes.sandboxId, sandboxId)
)
)
return success
} else return methodNotAllowed
} else if (path === "/api/sandbox/generate" && method === "POST") {
const generateSchema = z.object({
userId: z.string(),
})
const body = await request.json()
const { userId } = generateSchema.parse(body)
return success
} else return methodNotAllowed
} else if (path === "/api/sandbox/like") {
if (method === "POST") {
const likeSchema = z.object({
sandboxId: z.string(),
userId: z.string(),
})
const dbUser = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.id, userId),
})
if (!dbUser) {
return new Response("User not found.", { status: 400 })
}
if (dbUser.generations !== null && dbUser.generations >= 10) {
return new Response("You reached the maximum # of generations.", {
status: 400,
})
}
try {
const body = await request.json()
const { sandboxId, userId } = likeSchema.parse(body)
await db
.update(user)
.set({ generations: sql`${user.generations} + 1` })
.where(eq(user.id, userId))
.get()
// Check if user has already liked
const existingLike = await db.query.sandboxLikes.findFirst({
where: (likes, { and, eq }) =>
and(eq(likes.sandboxId, sandboxId), eq(likes.userId, userId)),
})
return success
} else if (path === "/api/user") {
if (method === "GET") {
const params = url.searchParams
if (existingLike) {
// Unlike
await db
.delete(sandboxLikes)
.where(
and(
eq(sandboxLikes.sandboxId, sandboxId),
eq(sandboxLikes.userId, userId)
)
)
if (params.has("id")) {
const id = params.get("id") as string
const res = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.id, id),
with: {
sandbox: {
orderBy: (sandbox, { desc }) => [desc(sandbox.createdAt)],
},
usersToSandboxes: true,
},
})
return json(res ?? {})
} else {
const res = await db.select().from(user).all()
return json(res ?? {})
}
} else if (method === "POST") {
const userSchema = z.object({
id: z.string(),
name: z.string(),
email: z.string().email(),
})
await db
.update(sandbox)
.set({
likeCount: sql`${sandbox.likeCount} - 1`,
})
.where(eq(sandbox.id, sandboxId))
const body = await request.json()
const { id, name, email } = userSchema.parse(body)
return json({
message: "Unlike successful",
liked: false,
})
} else {
// Like
await db.insert(sandboxLikes).values({
sandboxId,
userId,
createdAt: new Date(),
})
await db
.update(sandbox)
.set({
likeCount: sql`${sandbox.likeCount} + 1`,
})
.where(eq(sandbox.id, sandboxId))
return json({
message: "Like successful",
liked: true,
})
}
} catch (error) {
return new Response("Invalid request format", { status: 400 })
}
} else if (method === "GET") {
const params = url.searchParams
const sandboxId = params.get("sandboxId")
const userId = params.get("userId")
if (!sandboxId || !userId) {
return invalidRequest
}
const like = await db.query.sandboxLikes.findFirst({
where: (likes, { and, eq }) =>
and(eq(likes.sandboxId, sandboxId), eq(likes.userId, userId)),
})
return json({
liked: !!like,
})
} else {
return methodNotAllowed
}
} else if (path === "/api/user") {
if (method === "GET") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
const res = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.id, id),
with: {
sandbox: {
orderBy: (sandbox, { desc }) => [desc(sandbox.createdAt)],
with: {
likes: true,
},
},
usersToSandboxes: true,
},
})
if (res) {
const transformedUser: UserResponse = {
...res,
sandbox: res.sandbox.map(
(sb): SandboxWithLiked => ({
...sb,
liked: sb.likes.some((like) => like.userId === id),
})
),
}
return json(transformedUser)
}
return json(res ?? {})
} else if (params.has("username")) {
const username = params.get("username") as string
const userId = params.get("currentUserId")
const res = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.username, username),
with: {
sandbox: {
orderBy: (sandbox, { desc }) => [desc(sandbox.createdAt)],
with: {
likes: true,
},
},
usersToSandboxes: true,
},
})
if (res) {
const transformedUser: UserResponse = {
...res,
sandbox: res.sandbox.map(
(sb): SandboxWithLiked => ({
...sb,
liked: sb.likes.some((like) => like.userId === userId),
})
),
}
return json(transformedUser)
}
return json(res ?? {})
} else {
const res = await db.select().from(user).all()
return json(res ?? {})
}
} else if (method === "POST") {
const userSchema = z.object({
id: z.string(),
name: z.string(),
email: z.string().email(),
username: z.string(),
avatarUrl: z.string().optional(),
githubToken: z.string().nullable().optional(),
createdAt: z.string().optional(),
generations: z.number().optional(),
tier: z.enum(["FREE", "PRO", "ENTERPRISE"]).optional(),
tierExpiresAt: z.number().optional(),
lastResetDate: z.number().optional(),
})
const body = await request.json()
const {
id,
name,
email,
username,
avatarUrl,
githubToken,
createdAt,
generations,
tier,
tierExpiresAt,
lastResetDate,
} = userSchema.parse(body)
const res = await db
.insert(user)
.values({
id,
name,
email,
username,
avatarUrl,
githubToken,
createdAt: createdAt ? new Date(createdAt) : new Date(),
generations,
tier,
tierExpiresAt,
lastResetDate,
})
.returning()
.get()
return json({ res })
} else if (method === "DELETE") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
await db.delete(user).where(eq(user.id, id))
return success
} else return invalidRequest
} else if (method === "PUT") {
const updateUserSchema = z.object({
id: z.string(),
name: z.string().optional(),
bio: z.string().optional(),
personalWebsite: z.string().optional(),
links: z
.array(
z.object({
url: z.string(),
platform: z.enum(schema.KNOWN_PLATFORMS),
})
)
.optional(),
email: z.string().email().optional(),
username: z.string().optional(),
avatarUrl: z.string().optional(),
githubToken: z.string().nullable().optional(),
generations: z.number().optional(),
})
try {
const body = await request.json()
const validatedData = updateUserSchema.parse(body)
const { id, username, ...updateData } = validatedData
// If username is being updated, check for existing username
if (username) {
const existingUser = await db
.select()
.from(user)
.where(eq(user.username, username))
.get()
if (existingUser && existingUser.id !== id) {
return json({ error: "Username already exists" }, { status: 409 })
}
}
const cleanUpdateData = {
...updateData,
...(username ? { username } : {}),
}
const res = await db
.update(user)
.set(cleanUpdateData)
.where(eq(user.id, id))
.returning()
.get()
if (!res) {
return json({ error: "User not found" }, { status: 404 })
}
return json({ res })
} catch (error) {
if (error instanceof z.ZodError) {
return json({ error: error.errors }, { status: 400 })
}
return json({ error: "Internal server error" }, { status: 500 })
}
} else {
return methodNotAllowed
}
} else if (path === "/api/user/check-username") {
if (method === "GET") {
const params = url.searchParams
const username = params.get("username")
if (!username) return invalidRequest
const exists = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.username, username),
})
return json({ exists: !!exists })
}
return methodNotAllowed
} else if (
path === "/api/user/increment-generations" &&
method === "POST"
) {
const schema = z.object({
userId: z.string(),
})
const body = await request.json()
const { userId } = schema.parse(body)
await db
.update(user)
.set({ generations: sql`${user.generations} + 1` })
.where(eq(user.id, userId))
.get()
return success
} else if (path === "/api/user/update-tier" && method === "POST") {
const schema = z.object({
userId: z.string(),
tier: z.enum(["FREE", "PRO", "ENTERPRISE"]),
tierExpiresAt: z.date(),
})
const body = await request.json()
const { userId, tier, tierExpiresAt } = schema.parse(body)
await db
.update(user)
.set({
tier,
tierExpiresAt: tierExpiresAt.getTime(),
// Reset generations when upgrading tier
generations: 0,
})
.where(eq(user.id, userId))
.get()
return success
} else if (path === "/api/user/check-reset" && method === "POST") {
const schema = z.object({
userId: z.string(),
})
const body = await request.json()
const { userId } = schema.parse(body)
const dbUser = await db.query.user.findFirst({
where: (user, { eq }) => eq(user.id, userId),
})
if (!dbUser) {
return new Response("User not found", { status: 404 })
}
const now = new Date()
const lastReset = dbUser.lastResetDate
? new Date(dbUser.lastResetDate)
: new Date(0)
if (
now.getMonth() !== lastReset.getMonth() ||
now.getFullYear() !== lastReset.getFullYear()
) {
await db
.update(user)
.set({
generations: 0,
lastResetDate: now.getTime(),
})
.where(eq(user.id, userId))
.get()
return new Response("Reset successful", { status: 200 })
}
return new Response("No reset needed", { status: 200 })
} else return notFound
},
const res = await db
.insert(user)
.values({ id, name, email })
.returning()
.get()
return json({ res })
} else if (method === "DELETE") {
const params = url.searchParams
if (params.has("id")) {
const id = params.get("id") as string
await db.delete(user).where(eq(user.id, id))
return success
} else return invalidRequest
} else {
return methodNotAllowed
}
} else return notFound
},
}
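
A minimal sketch of how another service might call this worker's API, assuming the `DATABASE_WORKER_URL` and `WORKERS_KEY` variables from the backend `.env.example` later in this diff. The worker rejects any request whose `Authorization` header does not match `env.KEY`.

```ts
// Sketch: fetching a sandbox record from the database worker (Node 18+ global fetch).
// DATABASE_WORKER_URL and WORKERS_KEY mirror the backend .env.example in this diff.
async function getSandbox(id: string) {
  const res = await fetch(
    `${process.env.DATABASE_WORKER_URL}/api/sandbox?id=${id}`,
    {
      headers: {
        // Compared against env.KEY inside the worker; mismatches return 401.
        Authorization: `${process.env.WORKERS_KEY}`,
      },
    }
  )
  if (!res.ok) throw new Error(`Database worker responded with ${res.status}`)
  return res.json()
}
```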

View File

@ -1,140 +1,66 @@
import { createId } from "@paralleldrive/cuid2"
import { relations, sql } from "drizzle-orm"
import { integer, primaryKey, sqliteTable, text } from "drizzle-orm/sqlite-core"
import { integer, sqliteTable, text } from "drizzle-orm/sqlite-core";
import { createId } from "@paralleldrive/cuid2";
import { relations, sql } from "drizzle-orm";
export const KNOWN_PLATFORMS = [
"github",
"twitter",
"instagram",
"bluesky",
"linkedin",
"youtube",
"twitch",
"discord",
"mastodon",
"threads",
"gitlab",
"generic",
] as const
export type KnownPlatform = (typeof KNOWN_PLATFORMS)[number]
export type UserLink = {
url: string
platform: KnownPlatform
}
// #region Tables
export const user = sqliteTable("user", {
id: text("id")
.$defaultFn(() => createId())
.primaryKey()
.unique(),
name: text("name").notNull(),
email: text("email").notNull(),
username: text("username").notNull().unique(),
avatarUrl: text("avatarUrl"),
githubToken: text("githubToken"),
createdAt: integer("createdAt", { mode: "timestamp_ms" }).default(
sql`CURRENT_TIMESTAMP`
),
generations: integer("generations").default(0),
bio: text("bio"),
personalWebsite: text("personalWebsite"),
links: text("links", { mode: "json" }).default("[]").$type<UserLink[]>(),
tier: text("tier", { enum: ["FREE", "PRO", "ENTERPRISE"] }).default("FREE"),
tierExpiresAt: integer("tierExpiresAt"),
lastResetDate: integer("lastResetDate"),
})
id: text("id")
.$defaultFn(() => createId())
.primaryKey()
.unique(),
name: text("name").notNull(),
email: text("email").notNull(),
image: text("image"),
generations: integer("generations").default(0),
});
export type User = typeof user.$inferSelect
export type User = typeof user.$inferSelect;
export const userRelations = relations(user, ({ many }) => ({
sandbox: many(sandbox),
usersToSandboxes: many(usersToSandboxes),
}));
export const sandbox = sqliteTable("sandbox", {
id: text("id")
.$defaultFn(() => createId())
.primaryKey()
.unique(),
name: text("name").notNull(),
type: text("type").notNull(),
visibility: text("visibility", { enum: ["public", "private"] }),
createdAt: integer("createdAt", { mode: "timestamp_ms" }).default(
sql`CURRENT_TIMESTAMP`
),
userId: text("user_id")
.notNull()
.references(() => user.id),
likeCount: integer("likeCount").default(0),
viewCount: integer("viewCount").default(0),
containerId: text("containerId"),
})
id: text("id")
.$defaultFn(() => createId())
.primaryKey()
.unique(),
name: text("name").notNull(),
type: text("type", { enum: ["react", "node"] }).notNull(),
visibility: text("visibility", { enum: ["public", "private"] }),
createdAt: integer("createdAt", { mode: "timestamp_ms" }),
userId: text("user_id")
.notNull()
.references(() => user.id),
});
export type Sandbox = typeof sandbox.$inferSelect
export const sandboxLikes = sqliteTable(
"sandbox_likes",
{
userId: text("user_id")
.notNull()
.references(() => user.id),
sandboxId: text("sandbox_id")
.notNull()
.references(() => sandbox.id),
createdAt: integer("createdAt", { mode: "timestamp_ms" }).default(
sql`CURRENT_TIMESTAMP`
),
},
(table) => ({
pk: primaryKey({ columns: [table.sandboxId, table.userId] }),
})
)
export const usersToSandboxes = sqliteTable("users_to_sandboxes", {
userId: text("userId")
.notNull()
.references(() => user.id),
sandboxId: text("sandboxId")
.notNull()
.references(() => sandbox.id),
sharedOn: integer("sharedOn", { mode: "timestamp_ms" }),
})
// #region Relations
export const userRelations = relations(user, ({ many }) => ({
sandbox: many(sandbox),
usersToSandboxes: many(usersToSandboxes),
likes: many(sandboxLikes),
}))
export type Sandbox = typeof sandbox.$inferSelect;
export const sandboxRelations = relations(sandbox, ({ one, many }) => ({
author: one(user, {
fields: [sandbox.userId],
references: [user.id],
}),
usersToSandboxes: many(usersToSandboxes),
likes: many(sandboxLikes),
}))
author: one(user, {
fields: [sandbox.userId],
references: [user.id],
}),
usersToSandboxes: many(usersToSandboxes),
}));
export const sandboxLikesRelations = relations(sandboxLikes, ({ one }) => ({
user: one(user, {
fields: [sandboxLikes.userId],
references: [user.id],
}),
sandbox: one(sandbox, {
fields: [sandboxLikes.sandboxId],
references: [sandbox.id],
}),
}))
export const usersToSandboxes = sqliteTable("users_to_sandboxes", {
userId: text("userId")
.notNull()
.references(() => user.id),
sandboxId: text("sandboxId")
.notNull()
.references(() => sandbox.id),
sharedOn: integer("sharedOn", { mode: "timestamp_ms" }),
});
export const usersToSandboxesRelations = relations(
usersToSandboxes,
({ one }) => ({
group: one(sandbox, {
fields: [usersToSandboxes.sandboxId],
references: [sandbox.id],
}),
user: one(user, {
fields: [usersToSandboxes.userId],
references: [user.id],
}),
})
)
// #endregion
export const usersToSandboxesRelations = relations(usersToSandboxes, ({ one }) => ({
group: one(sandbox, {
fields: [usersToSandboxes.sandboxId],
references: [sandbox.id],
}),
user: one(user, {
fields: [usersToSandboxes.userId],
references: [user.id],
}),
}));
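
As a rough sketch, the schema above can be queried through drizzle's relational API the same way the database worker does; the `./schema` import path and binding name are illustrative.

```ts
// Sketch: relational query against the schema on a D1 binding, mirroring the worker's usage.
import { drizzle } from "drizzle-orm/d1"
import * as schema from "./schema"

export async function getUserWithSandboxes(d1: D1Database, userId: string) {
  const db = drizzle(d1, { schema })
  // Returns the user plus their sandboxes, newest first.
  return db.query.user.findFirst({
    where: (user, { eq }) => eq(user.id, userId),
    with: {
      sandbox: {
        orderBy: (sandbox, { desc }) => [desc(sandbox.createdAt)],
      },
    },
  })
}
```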

View File

@ -1,30 +1,25 @@
// test/index.spec.ts
import {
createExecutionContext,
env,
SELF,
waitOnExecutionContext,
} from "cloudflare:test"
import { describe, expect, it } from "vitest"
import worker from "../src/index"
// For now, you'll need to do something like this to get a correctly-typed
// `Request` to pass to `worker.fetch()`.
const IncomingRequest = Request<unknown, IncomingRequestCfProperties>
describe("Hello World worker", () => {
it("responds with Hello World! (unit style)", async () => {
const request = new IncomingRequest("http://example.com")
// Create an empty context to pass to `worker.fetch()`.
const ctx = createExecutionContext()
const response = await worker.fetch(request, env, ctx)
// Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions
await waitOnExecutionContext(ctx)
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`)
})
it("responds with Hello World! (unit style)", async () => {
const request = new IncomingRequest("http://example.com");
// Create an empty context to pass to `worker.fetch()`.
const ctx = createExecutionContext();
const response = await worker.fetch(request, env, ctx);
// Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions
await waitOnExecutionContext(ctx);
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
it("responds with Hello World! (integration style)", async () => {
const response = await SELF.fetch("https://example.com")
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`)
})
})
it("responds with Hello World! (integration style)", async () => {
const response = await SELF.fetch("https://example.com");
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
});

View File

@ -1,11 +1,11 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"types": [
"@cloudflare/workers-types/experimental",
"@cloudflare/vitest-pool-workers"
]
},
"include": ["./**/*.ts", "../src/env.d.ts"],
"exclude": []
"extends": "../tsconfig.json",
"compilerOptions": {
"types": [
"@cloudflare/workers-types/experimental",
"@cloudflare/vitest-pool-workers"
]
},
"include": ["./**/*.ts", "../src/env.d.ts"],
"exclude": []
}

View File

@ -12,9 +12,7 @@
/* Language and Environment */
"target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */,
"lib": [
"es2021"
] /* Specify a set of bundled library declaration files that describe the target runtime environment. */,
"lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */,
"jsx": "react" /* Specify what JSX code is generated. */,
// "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */

View File

@ -1,11 +1,11 @@
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config"
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config";
export default defineWorkersConfig({
test: {
poolOptions: {
workers: {
wrangler: { configPath: "./wrangler.toml" },
},
},
},
})

View File

@ -1,13 +1,7 @@
# Set WORKERS_KEY to be the same as KEY in /backend/storage/wrangler.toml.
# Set DATABASE_WORKER_URL and STORAGE_WORKER_URL after deploying the workers.
# DOKKU_HOST and DOKKU_USERNAME are used to authenticate via SSH with the Dokku server
# DOKKU_KEY is the path to an SSH (.pem) key on the local machine
PORT=4000
WORKERS_KEY=
DATABASE_WORKER_URL=
STORAGE_WORKER_URL=
E2B_API_KEY=
DOKKU_HOST=
DOKKU_USERNAME=
DOKKU_KEY=

View File

@ -1,7 +1,5 @@
{
"watch": [
"src"
],
"watch": ["src"],
"ext": "ts",
"exec": "concurrently \"npx tsc --watch\" \"ts-node src/index.ts\""
}

File diff suppressed because it is too large Load Diff

View File

@ -14,21 +14,16 @@
"concurrently": "^8.2.2",
"cors": "^2.8.5",
"dotenv": "^16.4.5",
"e2b": "^1.0.5",
"express": "^4.19.2",
"jzip": "^1.0.0",
"node-pty": "^1.0.0",
"rate-limiter-flexible": "^5.0.3",
"simple-git": "^3.25.0",
"socket.io": "^4.7.5",
"ssh2": "^1.15.0",
"zod": "^3.22.4"
},
"devDependencies": {
"@types/cors": "^2.8.17",
"@types/express": "^4.17.21",
"@types/jszip": "^3.4.1",
"@types/node": "^20.12.7",
"@types/ssh2": "^1.15.0",
"nodemon": "^3.1.0",
"ts-node": "^10.9.2",
"typescript": "^5.4.5"

View File

@ -1,61 +0,0 @@
import { Socket } from "socket.io"
class Counter {
private count: number = 0
increment() {
this.count++
}
decrement() {
this.count = Math.max(0, this.count - 1)
}
getValue(): number {
return this.count
}
}
// Owner Connection Management
export class ConnectionManager {
// Counts how many times the owner is connected to a sandbox
private ownerConnections: Record<string, Counter> = {}
// Stores all sockets connected to a given sandbox
private sockets: Record<string, Set<Socket>> = {}
// Checks if the owner of a sandbox is connected
ownerIsConnected(sandboxId: string): boolean {
return this.ownerConnections[sandboxId]?.getValue() > 0
}
// Adds a connection for a sandbox
addConnectionForSandbox(socket: Socket, sandboxId: string, isOwner: boolean) {
this.sockets[sandboxId] ??= new Set()
this.sockets[sandboxId].add(socket)
// If the connection is for the owner, increments the owner connection counter
if (isOwner) {
this.ownerConnections[sandboxId] ??= new Counter()
this.ownerConnections[sandboxId].increment()
}
}
// Removes a connection for a sandbox
removeConnectionForSandbox(
socket: Socket,
sandboxId: string,
isOwner: boolean
) {
this.sockets[sandboxId]?.delete(socket)
// If the connection being removed is for the owner, decrements the owner connection counter
if (isOwner) {
this.ownerConnections[sandboxId]?.decrement()
}
}
// Returns the set of sockets connected to a given sandbox
connectionsForSandbox(sandboxId: string): Set<Socket> {
return this.sockets[sandboxId] ?? new Set()
}
}
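
A minimal sketch of how the socket server might use `ConnectionManager`, assuming the sandbox id and owner flag arrive on the handshake; the import path is illustrative.

```ts
// Sketch: tracking per-sandbox connections in a Socket.IO server.
import { Server } from "socket.io"
import { ConnectionManager } from "./ConnectionManager"

const io = new Server(4000)
const connections = new ConnectionManager()

io.on("connection", (socket) => {
  // Illustrative only: the real server derives these from its auth middleware.
  const sandboxId = socket.handshake.query.sandboxId as string
  const isOwner = socket.handshake.query.isOwner === "true"

  connections.addConnectionForSandbox(socket, sandboxId, isOwner)

  socket.on("disconnect", () => {
    connections.removeConnectionForSandbox(socket, sandboxId, isOwner)
    if (!connections.ownerIsConnected(sandboxId)) {
      // No owner connections left: a good place to schedule container shutdown.
    }
  })
})
```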

View File

@ -1,61 +0,0 @@
import { SSHConfig, SSHSocketClient } from "./SSHSocketClient"
// Interface for the response structure from Dokku commands
export interface DokkuResponse {
ok: boolean
output: string
}
// DokkuClient class extends SSHSocketClient to interact with Dokku via SSH
export class DokkuClient extends SSHSocketClient {
constructor(config: SSHConfig) {
// Initialize with Dokku daemon socket path
super(config, "/var/run/dokku-daemon/dokku-daemon.sock")
}
// Send a command to Dokku and parse the response
async sendCommand(command: string): Promise<DokkuResponse> {
try {
const response = await this.sendData(command)
if (typeof response !== "string") {
throw new Error("Received data is not a string")
}
// Parse the JSON response from Dokku
return JSON.parse(response)
} catch (error: any) {
throw new Error(`Failed to send command: ${error.message}`)
}
}
// List all deployed Dokku apps
async listApps(): Promise<string[]> {
const response = await this.sendCommand("--quiet apps:list")
return response.output.split("\n")
}
// Get the creation timestamp of an app
async getAppCreatedAt(appName: string): Promise<number> {
const response = await this.sendCommand(
`apps:report --app-created-at ${appName}`
)
const createdAt = parseInt(response.output.trim(), 10)
if (isNaN(createdAt)) {
throw new Error(
`Failed to retrieve creation timestamp for app ${appName}`
)
}
return createdAt
}
// Check if an app exists
async appExists(appName: string): Promise<boolean> {
const response = await this.sendCommand(`apps:exists ${appName}`)
return response.output.includes("App") === false
}
}
export { SSHConfig }
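
A usage sketch, assuming the `DOKKU_HOST`, `DOKKU_USERNAME`, and `DOKKU_KEY` variables from the backend `.env.example` earlier in this diff; the import path and app name are illustrative.

```ts
// Sketch: connecting to the Dokku daemon over SSH and listing deployed apps.
import fs from "fs"
import { DokkuClient } from "./DokkuClient"

async function main() {
  const client = new DokkuClient({
    host: process.env.DOKKU_HOST!,
    username: process.env.DOKKU_USERNAME!,
    privateKey: fs.readFileSync(process.env.DOKKU_KEY!), // path to the .pem key
  })
  await client.connect()

  console.log("Deployed apps:", await client.listApps())
  console.log("my-app exists:", await client.appExists("my-app")) // illustrative app name
}

main().catch(console.error)
```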

View File

@ -1,594 +0,0 @@
import { FilesystemEvent, Sandbox, WatchHandle } from "e2b"
import JSZip from "jszip"
import path from "path"
import RemoteFileStorage from "./RemoteFileStorage"
import { MAX_BODY_SIZE } from "./ratelimit"
import { TFile, TFileData, TFolder } from "./types"
// Convert list of paths to the hierarchical file structure used by the editor
function generateFileStructure(paths: string[]): (TFolder | TFile)[] {
const root: TFolder = { id: "/", type: "folder", name: "/", children: [] }
paths.forEach((path) => {
const parts = path.split("/")
let current: TFolder = root
for (let i = 0; i < parts.length; i++) {
const part = parts[i]
const isFile = i === parts.length - 1 && part.length
const existing = current.children.find((child) => child.name === part)
if (existing) {
if (!isFile) {
current = existing as TFolder
}
} else {
if (isFile) {
const file: TFile = {
id: `/${parts.join("/")}`,
type: "file",
name: part,
}
current.children.push(file)
} else {
const folder: TFolder = {
id: `/${parts.slice(0, i + 1).join("/")}`,
type: "folder",
name: part,
children: [],
}
current.children.push(folder)
current = folder
}
}
}
})
return root.children
}
// FileManager class to handle file operations in a sandbox
export class FileManager {
private sandboxId: string
private sandbox: Sandbox
public files: (TFolder | TFile)[]
public fileData: TFileData[]
private fileWatchers: WatchHandle[] = []
private dirName = "/home/user/project"
private refreshFileList: ((files: (TFolder | TFile)[]) => void) | null
// Constructor to initialize the FileManager
constructor(
sandboxId: string,
sandbox: Sandbox,
refreshFileList: ((files: (TFolder | TFile)[]) => void) | null
) {
this.sandboxId = sandboxId
this.sandbox = sandbox
this.files = []
this.fileData = []
this.refreshFileList = refreshFileList
}
// Fetch file data from list of paths
private async generateFileData(paths: string[]): Promise<TFileData[]> {
const fileData: TFileData[] = []
for (const path of paths) {
const parts = path.split("/")
const isFile = parts.length > 0 && parts[parts.length - 1].length > 0
if (isFile) {
const fileId = `/${parts.join("/")}`
const data = await RemoteFileStorage.fetchFileContent(
`projects/${this.sandboxId}${fileId}`
)
fileData.push({ id: fileId, data })
}
}
return fileData
}
// Convert local file path to remote path
private getRemoteFileId(localId: string): string {
return [
"projects",
this.sandboxId,
localId.startsWith("/") ? localId : localId,
].join("/")
}
// Convert remote file path to local file path
private getLocalFileId(remoteId: string): string | undefined {
const allParts = remoteId.split("/")
if (allParts[1] !== this.sandboxId) return undefined
return allParts.slice(2).join("/")
}
// Convert remote file paths to local file paths
private getLocalFileIds(remoteIds: string[]): string[] {
return remoteIds
.map(this.getLocalFileId.bind(this))
.filter((id) => id !== undefined)
}
// Download files from remote storage
private async updateFileData(): Promise<TFileData[]> {
const remotePaths = await RemoteFileStorage.getSandboxPaths(this.sandboxId)
const localPaths = this.getLocalFileIds(remotePaths)
this.fileData = await this.generateFileData(localPaths)
return this.fileData
}
// Update file structure
private async updateFileStructure(): Promise<(TFolder | TFile)[]> {
const remotePaths = await RemoteFileStorage.getSandboxPaths(this.sandboxId)
const localPaths = this.getLocalFileIds(remotePaths)
this.files = generateFileStructure(localPaths)
return this.files
}
private async loadLocalFiles() {
// Reload file list from the container to include template files
const result = await this.sandbox.commands.run(
`find "${this.dirName}" -type f`
) // List all files recursively
const localPaths = result.stdout.split("\n").filter((path) => path) // Split the output into an array and filter out empty strings
const relativePaths = localPaths.map((filePath) =>
path.posix.relative(this.dirName, filePath)
) // Convert absolute paths to relative paths
this.files = generateFileStructure(relativePaths)
}
// Initialize the FileManager
async initialize() {
// Download files from remote file storage
await this.updateFileStructure()
await this.updateFileData()
// Copy all files from the project to the container
const promises = this.fileData.map(async (file) => {
try {
const filePath = path.posix.join(this.dirName, file.id)
const parentDirectory = path.dirname(filePath)
if (!this.sandbox.files.exists(parentDirectory)) {
await this.sandbox.files.makeDir(parentDirectory)
}
await this.sandbox.files.write(filePath, file.data)
} catch (e: any) {
console.log("Failed to create file: " + e)
}
})
await Promise.all(promises)
await this.loadLocalFiles()
// Make the logged in user the owner of all project files
this.fixPermissions()
await this.watchDirectory(this.dirName)
await this.watchSubdirectories(this.dirName)
}
// Check if the given path is a directory
private async isDirectory(directoryPath: string): Promise<boolean> {
try {
const result = await this.sandbox.commands.run(
`[ -d "${directoryPath}" ] && echo "true" || echo "false"`
)
return result.stdout.trim() === "true"
} catch (e: any) {
console.log("Failed to check if directory: " + e)
return false
}
}
// Change the owner of the project directory to user
private async fixPermissions() {
try {
await this.sandbox.commands.run(`sudo chown -R user "${this.dirName}"`)
} catch (e: any) {
console.log("Failed to fix permissions: " + e)
}
}
// Watch a directory for changes
async watchDirectory(directory: string): Promise<WatchHandle | undefined> {
try {
const handle = await this.sandbox.files.watchDir(
directory,
async (event: FilesystemEvent) => {
try {
function removeDirName(path: string, dirName: string) {
return path.startsWith(dirName)
? path.slice(dirName.length)
: path
}
// This is the absolute file path in the container
const containerFilePath = path.posix.join(directory, event.name)
// This is the file path relative to the project directory
const sandboxFilePath = removeDirName(
containerFilePath,
this.dirName
)
// This is the directory being watched relative to the project directory
const sandboxDirectory = removeDirName(directory, this.dirName)
// Helper function to find a folder by id
function findFolderById(
files: (TFolder | TFile)[],
folderId: string
) {
return files.find(
(file: TFolder | TFile) =>
file.type === "folder" && file.id === folderId
)
}
// Handle file/directory creation event
if (event.type === "create") {
await this.loadLocalFiles()
console.log(`Create ${sandboxFilePath}`)
}
// Handle file/directory removal or rename event
else if (event.type === "remove" || event.type == "rename") {
await this.loadLocalFiles()
console.log(`Removed: ${sandboxFilePath}`)
}
// Handle file write event
else if (event.type === "write") {
const folder = findFolderById(
this.files,
sandboxDirectory
) as TFolder
const fileToWrite = this.fileData.find(
(file) => file.id === sandboxFilePath
)
if (fileToWrite) {
fileToWrite.data = await this.sandbox.files.read(
containerFilePath
)
console.log(`Write to ${sandboxFilePath}`)
} else {
// If the file is part of a folder structure, locate it and update its data
const fileInFolder = folder?.children.find(
(file) => file.id === sandboxFilePath
)
if (fileInFolder) {
const fileData = await this.sandbox.files.read(
containerFilePath
)
const fileContents =
typeof fileData === "string" ? fileData : ""
this.fileData.push({
id: sandboxFilePath,
data: fileContents,
})
console.log(`Write to ${sandboxFilePath}`)
}
}
}
// Tell the client to reload the file list
if (event.type !== "chmod") {
this.refreshFileList?.(this.files)
}
} catch (error) {
console.error(
`Error handling ${event.type} event for ${event.name}:`,
error
)
}
},
{ timeoutMs: 0 }
)
this.fileWatchers.push(handle)
return handle
} catch (error) {
console.error(`Error watching filesystem:`, error)
}
}
// Watch subdirectories recursively
async watchSubdirectories(directory: string) {
const dirContent = await this.sandbox.files.list(directory)
await Promise.all(
dirContent.map(async (item) => {
if (item.type === "dir") {
console.log("Watching " + item.path)
await this.watchDirectory(item.path)
}
})
)
}
// Get file content
async getFile(fileId: string): Promise<string | undefined> {
const filePath = path.posix.join(this.dirName, fileId)
const fileContent = await this.sandbox.files.read(filePath)
return fileContent
}
// Get folder content
async getFolder(folderId: string): Promise<string[]> {
const remotePaths = await RemoteFileStorage.getFolder(
this.getRemoteFileId(folderId)
)
return this.getLocalFileIds(remotePaths)
}
// Save file content
async saveFile(fileId: string, body: string): Promise<void> {
if (!fileId) return // handles saving when no file is open
if (Buffer.byteLength(body, "utf-8") > MAX_BODY_SIZE) {
throw new Error("File size too large. Please reduce the file size.")
}
// Save to remote storage
await RemoteFileStorage.saveFile(this.getRemoteFileId(fileId), body)
// Update local file data cache
let file = this.fileData.find((f) => f.id === fileId)
if (file) {
file.data = body
} else {
file = {
id: fileId,
data: body,
}
this.fileData.push(file)
}
// Save to sandbox filesystem
const filePath = path.posix.join(this.dirName, fileId)
await this.sandbox.files.write(filePath, body)
// Instead of updating the entire file structure, just ensure this file exists in it
const parts = fileId.split('/').filter(Boolean)
let current = this.files
let currentPath = ''
// Navigate/create the path to the file
for (let i = 0; i < parts.length - 1; i++) {
currentPath += '/' + parts[i]
let folder = current.find(
(f) => f.type === 'folder' && f.name === parts[i]
) as TFolder
if (!folder) {
folder = {
id: currentPath,
type: 'folder',
name: parts[i],
children: [],
}
current.push(folder)
}
current = folder.children
}
// Add the file to the structure if it isn't already there
const fileName = parts[parts.length - 1]
const existingFile = current.find(
(f) => f.type === 'file' && f.name === fileName
)
if (!existingFile) {
current.push({
id: fileId,
type: 'file',
name: fileName,
})
}
this.refreshFileList?.(this.files)
this.fixPermissions()
}
// Move a file to a different folder
async moveFile(
fileId: string,
folderId: string
): Promise<(TFolder | TFile)[]> {
const fileData = this.fileData.find((f) => f.id === fileId)
const file = this.files.find((f) => f.id === fileId)
if (!fileData || !file) return this.files
const parts = fileId.split("/")
const newFileId = folderId + "/" + parts.pop()
await this.moveFileInContainer(fileId, newFileId)
await this.fixPermissions()
fileData.id = newFileId
file.id = newFileId
await RemoteFileStorage.renameFile(
this.getRemoteFileId(fileId),
this.getRemoteFileId(newFileId),
fileData.data
)
return this.updateFileStructure()
}
// Move a file within the container
private async moveFileInContainer(oldPath: string, newPath: string) {
try {
const fileContents = await this.sandbox.files.read(
path.posix.join(this.dirName, oldPath)
)
await this.sandbox.files.write(
path.posix.join(this.dirName, newPath),
fileContents
)
await this.sandbox.files.remove(path.posix.join(this.dirName, oldPath))
} catch (e) {
console.error(`Error moving file from ${oldPath} to ${newPath}:`, e)
}
}
// Create a new file
async createFile(name: string): Promise<boolean> {
const size: number = await RemoteFileStorage.getProjectSize(this.sandboxId)
if (size > 200 * 1024 * 1024) {
throw new Error("Project size exceeded. Please delete some files.")
}
const id = `/${name}`
await this.sandbox.files.write(path.posix.join(this.dirName, id), "")
await this.fixPermissions()
await RemoteFileStorage.createFile(this.getRemoteFileId(id))
return true
}
public async loadFileContent(): Promise<TFileData[]> {
// Get all file paths, excluding node_modules
const result = await this.sandbox.commands.run(
`find "${this.dirName}" -path "${this.dirName}/node_modules" -prune -o -type f -print`
)
const filePaths = result.stdout.split("\n").filter((path) => path) ?? []
console.log("Paths found for download (excluding node_modules):", filePaths)
// Add files to zip with synchronized content
for (const filePath of filePaths) {
const relativePath = filePath.replace(this.dirName, "") // Remove base directory from path
try {
// Read the file content from the sandbox
const content = await this.sandbox.files.read(filePath)
// Find the existing file data entry or create a new one
const fileDataEntry = this.fileData.find(
(f) => f.id === relativePath
) || {
id: relativePath,
data: typeof content === "string" ? content : "",
}
// Update the file data entry if it already exists, otherwise add it to the list
if (!this.fileData.includes(fileDataEntry)) {
this.fileData.push(fileDataEntry)
} else {
fileDataEntry.data = typeof content === "string" ? content : ""
}
} catch (error) {
console.error(`Failed to read content for ${relativePath}:`, error)
}
}
return this.fileData
}
public async getFilesForDownload(): Promise<string> {
// Create new JSZip instance
const zip = new JSZip()
await this.loadFileContent()
if (this.fileData.length === 0) {
console.error(
"No files found in the sandbox project directory for download."
)
return ""
}
// Add files to zip with synchronized content
for (const fileDataEntry of this.fileData) {
const relativePath = fileDataEntry.id
const content = fileDataEntry.data
zip.file(relativePath, content)
console.log(`Added file to ZIP: ${relativePath}`)
}
// Generate zip file
const zipBlob = await zip.generateAsync({
type: "blob",
compression: "DEFLATE",
compressionOptions: {
level: 6,
},
})
// Convert Blob to Base64
const zipBlobArrayBuffer = await zipBlob.arrayBuffer()
const zipBlobBase64 = btoa(
String.fromCharCode(...new Uint8Array(zipBlobArrayBuffer))
)
return zipBlobBase64
}
// Create a new folder
async createFolder(name: string): Promise<void> {
const id = `/${name}`
await this.sandbox.files.makeDir(path.posix.join(this.dirName, id))
}
// Rename a file
async renameFile(fileId: string, newName: string): Promise<void> {
const fileData = this.fileData.find((f) => f.id === fileId)
const file = this.files.find((f) => f.id === fileId)
if (!fileData || !file) return
const parts = fileId.split("/")
const newFileId = parts.slice(0, parts.length - 1).join("/") + "/" + newName
await this.moveFileInContainer(fileId, newFileId)
await this.fixPermissions()
await RemoteFileStorage.renameFile(
this.getRemoteFileId(fileId),
this.getRemoteFileId(newFileId),
fileData.data
)
fileData.id = newFileId
file.id = newFileId
}
// Delete a file
async deleteFile(fileId: string): Promise<(TFolder | TFile)[]> {
const file = this.fileData.find((f) => f.id === fileId)
if (!file) return this.files
await this.sandbox.files.remove(path.posix.join(this.dirName, fileId))
await RemoteFileStorage.deleteFile(this.getRemoteFileId(fileId))
return this.updateFileStructure()
}
// Delete a folder
async deleteFolder(folderId: string): Promise<(TFolder | TFile)[]> {
const files = await RemoteFileStorage.getFolder(
this.getRemoteFileId(folderId)
)
await Promise.all(
files.map(async (file) => {
await this.sandbox.files.remove(path.posix.join(this.dirName, file))
await RemoteFileStorage.deleteFile(this.getRemoteFileId(file))
})
)
return this.updateFileStructure()
}
// Close all file watchers
async closeWatchers() {
await Promise.all(
this.fileWatchers.map(async (handle: WatchHandle) => {
await handle.stop()
})
)
}
}
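
A rough sketch of standalone usage, mirroring how the `Sandbox` class later in this diff wires a `FileManager` to an E2B container. The sandbox id and template name are illustrative, and `STORAGE_WORKER_URL`/`WORKERS_KEY` must be set for the remote-storage calls.

```ts
// Sketch: attaching a FileManager to an E2B sandbox and doing a simple read/write.
import { Sandbox as E2BSandbox } from "e2b"
import { FileManager } from "./FileManager"

async function main() {
  const container = await E2BSandbox.create("base", { timeoutMs: 120_000 })

  const fileManager = new FileManager(
    "example-sandbox-id", // illustrative id; files live under projects/<id>/ in remote storage
    container,
    (files) => console.log("File tree updated:", files.length, "top-level entries")
  )

  // Pulls remote files into /home/user/project and starts directory watchers.
  await fileManager.initialize()

  await fileManager.saveFile("/hello.txt", "Hello from the sandbox!")
  console.log(await fileManager.getFile("/hello.txt"))

  await fileManager.closeWatchers()
}

main().catch(console.error)
```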

View File

@ -1,113 +0,0 @@
import * as dotenv from "dotenv"
import { R2Files } from "./types"
dotenv.config()
export const RemoteFileStorage = {
getSandboxPaths: async (id: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?sandboxId=${id}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
const data: R2Files = await res.json()
return data.objects.map((obj) => obj.key)
},
getFolder: async (folderId: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?folderId=${folderId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
const data: R2Files = await res.json()
return data.objects.map((obj) => obj.key)
},
fetchFileContent: async (fileId: string): Promise<string> => {
try {
const fileRes = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?fileId=${fileId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
return await fileRes.text()
} catch (error) {
console.error("ERROR fetching file:", error)
return ""
}
},
createFile: async (fileId: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId }),
})
return res.ok
},
renameFile: async (fileId: string, newFileId: string, data: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api/rename`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId, newFileId, data }),
})
return res.ok
},
saveFile: async (fileId: string, data: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api/save`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId, data }),
})
return res.ok
},
deleteFile: async (fileId: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api`, {
method: "DELETE",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId }),
})
return res.ok
},
getProjectSize: async (id: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api/size?sandboxId=${id}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
return (await res.json()).size
},
}
export default RemoteFileStorage
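
A minimal round-trip sketch against the storage worker, assuming `STORAGE_WORKER_URL` and `WORKERS_KEY` are set as in the backend `.env.example`; the ids are illustrative.

```ts
// Sketch: saving, reading and listing files for a sandbox via RemoteFileStorage.
import RemoteFileStorage from "./RemoteFileStorage"

async function main() {
  const sandboxId = "example-sandbox" // illustrative
  const fileId = `projects/${sandboxId}/index.js` // remote ids are prefixed with projects/<sandboxId>

  await RemoteFileStorage.saveFile(fileId, "console.log('hello')")
  console.log(await RemoteFileStorage.fetchFileContent(fileId))

  console.log("Stored paths:", await RemoteFileStorage.getSandboxPaths(sandboxId))
  console.log("Project size:", await RemoteFileStorage.getProjectSize(sandboxId))
}

main().catch(console.error)
```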

View File

@ -1,104 +0,0 @@
import { Client } from "ssh2"
// Interface defining the configuration for SSH connection
export interface SSHConfig {
host: string
port?: number
username: string
privateKey: Buffer
}
// Class to handle SSH connections and communicate with a Unix socket
export class SSHSocketClient {
private conn: Client
private config: SSHConfig
private socketPath: string
private _isConnected: boolean = false
public get isConnected(): boolean {
return this._isConnected
}
// Constructor initializes the SSH client and sets up configuration
constructor(config: SSHConfig, socketPath: string) {
this.conn = new Client()
this.config = { ...config, port: 22 } // Default port to 22 if not provided
this.socketPath = socketPath
this.setupTerminationHandlers()
}
// Set up handlers for graceful termination
private setupTerminationHandlers() {
process.on("SIGINT", this.closeConnection.bind(this))
process.on("SIGTERM", this.closeConnection.bind(this))
}
// Method to close the SSH connection
private closeConnection() {
console.log("Closing SSH connection...")
this.conn.end()
this._isConnected = false
process.exit(0)
}
// Method to establish the SSH connection
connect(): Promise<void> {
return new Promise((resolve, reject) => {
this.conn
.on("ready", () => {
console.log("SSH connection established")
this._isConnected = true
resolve()
})
.on("error", (err) => {
console.error("SSH connection error:", err)
this._isConnected = false
reject(err)
})
.on("close", () => {
console.log("SSH connection closed")
this._isConnected = false
})
.connect(this.config)
})
}
// Method to send data through the SSH connection to the Unix socket
sendData(data: string): Promise<string> {
return new Promise((resolve, reject) => {
if (!this.isConnected) {
reject(new Error("SSH connection is not established"))
return
}
// Use netcat to send data to the Unix socket
this.conn.exec(
`echo "${data}" | nc -U ${this.socketPath}`,
(err, stream) => {
if (err) {
reject(err)
return
}
stream
.on("close", (code: number, signal: string) => {
reject(
new Error(
`Stream closed with code ${code} and signal ${signal}`
)
)
})
.on("data", (data: Buffer) => {
// Netcat remains open until it is closed, so we close the connection once we receive data.
resolve(data.toString())
stream.close()
})
.stderr.on("data", (data: Buffer) => {
reject(new Error(data.toString()))
stream.close()
})
}
)
})
}
}
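
A sketch of using the client directly against a Unix socket on a remote host; the host, key path, and command are illustrative (the `DokkuClient` earlier in this diff fixes the socket path to the Dokku daemon).

```ts
// Sketch: sending a raw command to a remote Unix socket through SSHSocketClient.
import fs from "fs"
import { SSHSocketClient } from "./SSHSocketClient"

async function main() {
  const client = new SSHSocketClient(
    {
      host: "203.0.113.10", // illustrative host
      username: "root",
      privateKey: fs.readFileSync("/path/to/key.pem"),
    },
    "/var/run/dokku-daemon/dokku-daemon.sock" // socket path on the remote machine
  )

  await client.connect()
  console.log(await client.sendData("--quiet apps:list"))
}

main().catch(console.error)
```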

View File

@ -1,305 +0,0 @@
import { Sandbox as E2BSandbox } from "e2b"
import { Socket } from "socket.io"
import { CONTAINER_TIMEOUT } from "./constants"
import { DokkuClient } from "./DokkuClient"
import { FileManager } from "./FileManager"
import {
createFileRL,
createFolderRL,
deleteFileRL,
renameFileRL,
saveFileRL,
} from "./ratelimit"
import { SecureGitClient } from "./SecureGitClient"
import { TerminalManager } from "./TerminalManager"
import { TFile, TFolder } from "./types"
import { LockManager } from "./utils"
const lockManager = new LockManager()
// Define a type for SocketHandler functions
type SocketHandler<T = Record<string, any>> = (args: T) => any
// Extract port number from a string
function extractPortNumber(inputString: string): number | null {
const cleanedString = inputString.replace(/\x1B\[[0-9;]*m/g, "")
const regex = /http:\/\/localhost:(\d+)/
const match = cleanedString.match(regex)
return match ? parseInt(match[1]) : null
}
type ServerContext = {
dokkuClient: DokkuClient | null
gitClient: SecureGitClient | null
}
export class Sandbox {
// Sandbox properties:
sandboxId: string
type: string
fileManager: FileManager | null
terminalManager: TerminalManager | null
container: E2BSandbox | null
// Server context:
dokkuClient: DokkuClient | null
gitClient: SecureGitClient | null
constructor(
sandboxId: string,
type: string,
{ dokkuClient, gitClient }: ServerContext
) {
// Sandbox properties:
this.sandboxId = sandboxId
this.type = type
this.fileManager = null
this.terminalManager = null
this.container = null
// Server context:
this.dokkuClient = dokkuClient
this.gitClient = gitClient
}
// Initializes the container for the sandbox environment
async initialize(
fileWatchCallback: ((files: (TFolder | TFile)[]) => void) | undefined
) {
// Acquire a lock to ensure exclusive access to the sandbox environment
await lockManager.acquireLock(this.sandboxId, async () => {
// Check if a container already exists and is running
if (this.container && (await this.container.isRunning())) {
console.log(`Found existing container ${this.sandboxId}`)
} else {
console.log("Creating container", this.sandboxId)
// Create a new container with a specified template and timeout
const templateTypes = [
"vanillajs",
"reactjs",
"nextjs",
"streamlit",
"php",
]
const template = templateTypes.includes(this.type)
? `gitwit-${this.type}`
: `base`
this.container = await E2BSandbox.create(template, {
timeoutMs: CONTAINER_TIMEOUT,
})
}
})
// Ensure a container was successfully created
if (!this.container) throw new Error("Failed to create container")
// Initialize the terminal manager if it hasn't been set up yet
if (!this.terminalManager) {
this.terminalManager = new TerminalManager(this.container)
console.log(`Terminal manager set up for ${this.sandboxId}`)
}
// Initialize the file manager if it hasn't been set up yet
if (!this.fileManager) {
this.fileManager = new FileManager(
this.sandboxId,
this.container,
fileWatchCallback ?? null
)
// Initialize the file manager and emit the initial files
await this.fileManager.initialize()
}
}
// Called when the client disconnects from the Sandbox
async disconnect() {
// Close all terminals managed by the terminal manager
await this.terminalManager?.closeAllTerminals()
// This way the terminal manager will be set up again if we reconnect
this.terminalManager = null
// Close all file watchers managed by the file manager
await this.fileManager?.closeWatchers()
// This way the file manager will be set up again if we reconnect
this.fileManager = null
}
handlers(connection: { userId: string; isOwner: boolean; socket: Socket }) {
// Handle heartbeat from a socket connection
const handleHeartbeat: SocketHandler = (_: any) => {
// Only keep the sandbox alive if the owner is still connected
if (connection.isOwner) {
this.container?.setTimeout(CONTAINER_TIMEOUT)
}
}
// Handle getting a file
const handleGetFile: SocketHandler = ({ fileId }: any) => {
return this.fileManager?.getFile(fileId)
}
// Handle getting a folder
const handleGetFolder: SocketHandler = ({ folderId }: any) => {
return this.fileManager?.getFolder(folderId)
}
// Handle saving a file
const handleSaveFile: SocketHandler = async ({ fileId, body }: any) => {
await saveFileRL.consume(connection.userId, 1)
return this.fileManager?.saveFile(fileId, body)
}
// Handle moving a file
const handleMoveFile: SocketHandler = ({ fileId, folderId }: any) => {
return this.fileManager?.moveFile(fileId, folderId)
}
// Handle listing apps
const handleListApps: SocketHandler = async (_: any) => {
if (!this.dokkuClient)
throw Error("Failed to retrieve apps list: No Dokku client")
return { success: true, apps: await this.dokkuClient.listApps() }
}
// Handle getting app creation timestamp
const handleGetAppCreatedAt: SocketHandler = async ({ appName }) => {
if (!this.dokkuClient)
throw new Error(
"Failed to retrieve app creation timestamp: No Dokku client"
)
return {
success: true,
createdAt: await this.dokkuClient.getAppCreatedAt(appName),
}
}
// Handle checking if an app exists
const handleAppExists: SocketHandler = async ({ appName }) => {
if (!this.dokkuClient) {
console.log("Failed to check app existence: No Dokku client")
return {
success: false,
}
}
if (!this.dokkuClient.isConnected) {
console.log(
"Failed to check app existence: The Dokku client is not connected"
)
return {
success: false,
}
}
return {
success: true,
exists: await this.dokkuClient.appExists(appName),
}
}
// Handle deploying code
const handleDeploy: SocketHandler = async (_: any) => {
if (!this.gitClient) throw Error("No git client")
if (!this.fileManager) throw Error("No file manager")
await this.gitClient.pushFiles(
await this.fileManager?.loadFileContent(),
this.sandboxId
)
return { success: true }
}
// Handle creating a file
const handleCreateFile: SocketHandler = async ({ name }: any) => {
await createFileRL.consume(connection.userId, 1)
return { success: await this.fileManager?.createFile(name) }
}
// Handle creating a folder
const handleCreateFolder: SocketHandler = async ({ name }: any) => {
await createFolderRL.consume(connection.userId, 1)
return { success: await this.fileManager?.createFolder(name) }
}
// Handle renaming a file
const handleRenameFile: SocketHandler = async ({
fileId,
newName,
}: any) => {
await renameFileRL.consume(connection.userId, 1)
return this.fileManager?.renameFile(fileId, newName)
}
// Handle deleting a file
const handleDeleteFile: SocketHandler = async ({ fileId }: any) => {
await deleteFileRL.consume(connection.userId, 1)
return this.fileManager?.deleteFile(fileId)
}
// Handle deleting a folder
const handleDeleteFolder: SocketHandler = ({ folderId }: any) => {
return this.fileManager?.deleteFolder(folderId)
}
// Handle creating a terminal session
const handleCreateTerminal: SocketHandler = async ({ id }: any) => {
await lockManager.acquireLock(this.sandboxId, async () => {
await this.terminalManager?.createTerminal(
id,
(responseString: string) => {
connection.socket.emit("terminalResponse", {
id,
data: responseString,
})
const port = extractPortNumber(responseString)
if (port) {
connection.socket.emit(
"previewURL",
"https://" + this.container?.getHost(port)
)
}
}
)
})
}
// Handle resizing a terminal
const handleResizeTerminal: SocketHandler = ({ dimensions }: any) => {
this.terminalManager?.resizeTerminal(dimensions)
}
// Handle sending data to a terminal
const handleTerminalData: SocketHandler = ({ id, data }: any) => {
return this.terminalManager?.sendTerminalData(id, data)
}
// Handle closing a terminal
const handleCloseTerminal: SocketHandler = ({ id }: any) => {
return this.terminalManager?.closeTerminal(id)
}
// Handle downloading files by download button
const handleDownloadFiles: SocketHandler = async () => {
if (!this.fileManager) throw Error("No file manager")
// Get the Base64 encoded ZIP string
const zipBase64 = await this.fileManager.getFilesForDownload()
return { zipBlob: zipBase64 }
}
return {
heartbeat: handleHeartbeat,
getFile: handleGetFile,
downloadFiles: handleDownloadFiles,
getFolder: handleGetFolder,
saveFile: handleSaveFile,
moveFile: handleMoveFile,
listApps: handleListApps,
getAppCreatedAt: handleGetAppCreatedAt,
getAppExists: handleAppExists,
deploy: handleDeploy,
createFile: handleCreateFile,
createFolder: handleCreateFolder,
renameFile: handleRenameFile,
deleteFile: handleDeleteFile,
deleteFolder: handleDeleteFolder,
createTerminal: handleCreateTerminal,
resizeTerminal: handleResizeTerminal,
terminalData: handleTerminalData,
closeTerminal: handleCloseTerminal,
}
}
}

View File

@ -1,84 +0,0 @@
import fs from "fs"
import os from "os"
import path from "path"
import simpleGit, { SimpleGit } from "simple-git"
export type FileData = {
id: string
data: string
}
export class SecureGitClient {
private gitUrl: string
private sshKeyPath: string
constructor(gitUrl: string, sshKeyPath: string) {
this.gitUrl = gitUrl
this.sshKeyPath = sshKeyPath
}
async pushFiles(fileData: FileData[], repository: string): Promise<void> {
let tempDir: string | undefined
try {
// Create a temporary directory
tempDir = fs.mkdtempSync(path.posix.join(os.tmpdir(), "git-push-"))
console.log(`Temporary directory created: ${tempDir}`)
// Write files to the temporary directory
console.log(`Writing ${fileData.length} files.`)
for (const { id, data } of fileData) {
const filePath = path.posix.join(tempDir, id)
const dirPath = path.dirname(filePath)
if (!fs.existsSync(dirPath)) {
fs.mkdirSync(dirPath, { recursive: true })
}
fs.writeFileSync(filePath, data)
}
// Initialize the simple-git instance with the temporary directory and custom SSH command
const git: SimpleGit = simpleGit(tempDir, {
config: [
"core.sshCommand=ssh -i " +
this.sshKeyPath +
" -o IdentitiesOnly=yes",
],
}).outputHandler((_command, stdout, stderr) => {
stdout.pipe(process.stdout)
stderr.pipe(process.stderr)
})
// Initialize a new Git repository
await git.init()
// Add remote repository
await git.addRemote("origin", `${this.gitUrl}:${repository}`)
// Add files to the repository
for (const { id, data } of fileData) {
await git.add(id.startsWith("/") ? id.slice(1) : id)
}
// Commit the changes
await git.commit("Add files.")
// Push the changes to the remote repository
await git.push("origin", "master", { "--force": null })
console.log("Files successfully pushed to the repository")
if (tempDir) {
fs.rmSync(tempDir, { recursive: true, force: true })
console.log(`Temporary directory removed: ${tempDir}`)
}
} catch (error) {
if (tempDir) {
fs.rmSync(tempDir, { recursive: true, force: true })
console.log(`Temporary directory removed: ${tempDir}`)
}
console.error("Error pushing files to the repository:", error)
throw error
}
}
}
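// Usage sketch (the Dokku host, key path, and file contents below are illustrative, not values from this repo):
// const gitClient = new SecureGitClient("dokku@your-dokku-host", "/path/to/ssh/key");
// await gitClient.pushFiles([{ id: "/index.js", data: "console.log('hello')" }], "my-app");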

View File

@ -1,70 +0,0 @@
import { CommandHandle, Sandbox } from "e2b"
// Terminal class to manage a pseudo-terminal (PTY) in a sandbox environment
export class Terminal {
private pty: CommandHandle | undefined // Holds the PTY process handle
private sandbox: Sandbox // Reference to the sandbox environment
// Constructor initializes the Terminal with a sandbox
constructor(sandbox: Sandbox) {
this.sandbox = sandbox
}
// Initialize the terminal with specified rows, columns, and data handler
async init({
rows = 20,
cols = 80,
onData,
}: {
rows?: number
cols?: number
onData: (responseData: string) => void
}): Promise<void> {
// Create a new PTY process
this.pty = await this.sandbox.pty.create({
rows,
cols,
timeoutMs: 0,
onData: (data: Uint8Array) => {
onData(new TextDecoder().decode(data)) // Convert received data to string and pass to handler
},
})
}
// Send data to the terminal
async sendData(data: string) {
if (this.pty) {
await this.sandbox.pty.sendInput(
this.pty.pid,
new TextEncoder().encode(data)
)
} else {
console.log("Cannot send data because pty is not initialized.")
}
}
// Resize the terminal
async resize(size: { cols: number; rows: number }): Promise<void> {
if (this.pty) {
await this.sandbox.pty.resize(this.pty.pid, size)
} else {
console.log("Cannot resize terminal because pty is not initialized.")
}
}
// Close the terminal, killing the PTY process and stopping the input stream
async close(): Promise<void> {
if (this.pty) {
await this.pty.kill()
} else {
console.log("Cannot kill pty because it is not initialized.")
}
}
}
// Usage example:
// const terminal = new Terminal(sandbox);
// await terminal.init();
// terminal.sendData('ls -la');
// await terminal.resize({ cols: 100, rows: 30 });
// await terminal.close();

View File

@ -1,74 +0,0 @@
import { Sandbox } from "e2b"
import { Terminal } from "./Terminal"
export class TerminalManager {
private sandbox: Sandbox
private terminals: Record<string, Terminal> = {}
constructor(sandbox: Sandbox) {
this.sandbox = sandbox
}
async createTerminal(
id: string,
onData: (responseString: string) => void
): Promise<void> {
if (this.terminals[id]) {
return
}
this.terminals[id] = new Terminal(this.sandbox)
await this.terminals[id].init({
onData,
cols: 80,
rows: 20,
})
const defaultDirectory = "/home/user/project"
const defaultCommands = [
`cd "${defaultDirectory}"`,
"export PS1='user> '",
"clear",
]
for (const command of defaultCommands) {
await this.terminals[id].sendData(command + "\r")
}
console.log("Created terminal", id)
}
async resizeTerminal(dimensions: {
cols: number
rows: number
}): Promise<void> {
Object.values(this.terminals).forEach((t) => {
t.resize(dimensions)
})
}
async sendTerminalData(id: string, data: string): Promise<void> {
if (!this.terminals[id]) {
return
}
await this.terminals[id].sendData(data)
}
async closeTerminal(id: string): Promise<void> {
if (!this.terminals[id]) {
return
}
await this.terminals[id].close()
delete this.terminals[id]
}
async closeAllTerminals(): Promise<void> {
await Promise.all(
Object.entries(this.terminals).map(async ([key, terminal]) => {
await terminal.close()
delete this.terminals[key]
})
)
}
}
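// Usage sketch (the terminal id and commands are illustrative):
// const manager = new TerminalManager(sandbox);
// await manager.createTerminal("term-1", (output) => console.log(output));
// await manager.sendTerminalData("term-1", "ls -la\r");
// await manager.resizeTerminal({ cols: 100, rows: 30 });
// await manager.closeAllTerminals();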

View File

@ -1,2 +0,0 @@
// The amount of time in ms that a container will stay alive without a heartbeat.
export const CONTAINER_TIMEOUT = 120_000
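// A minimal sketch of how this constant might pair with the heartbeat handler
// (the timer variable and container.pause() call are assumptions for illustration):
// let timer = setTimeout(() => container.pause(), CONTAINER_TIMEOUT)
// const handleHeartbeat = () => {
//   clearTimeout(timer)
//   timer = setTimeout(() => container.pause(), CONTAINER_TIMEOUT)
// }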

View File

@ -1,185 +1,465 @@
import cors from "cors"
import dotenv from "dotenv"
import express, { Express } from "express"
import fs from "fs"
import { createServer } from "http"
import { Server, Socket } from "socket.io"
import fs from "fs";
import os from "os";
import path from "path";
import cors from "cors";
import express, { Express } from "express";
import dotenv from "dotenv";
import { createServer } from "http";
import { Server } from "socket.io";
import { ConnectionManager } from "./ConnectionManager"
import { DokkuClient } from "./DokkuClient"
import { Sandbox } from "./Sandbox"
import { SecureGitClient } from "./SecureGitClient"
import { socketAuth } from "./socketAuth" // Import the new socketAuth middleware
import { TFile, TFolder } from "./types"
import { z } from "zod";
import { User } from "./types";
import {
createFile,
deleteFile,
getFolder,
getProjectSize,
getSandboxFiles,
renameFile,
saveFile,
} from "./utils";
import { IDisposable, IPty, spawn } from "node-pty";
import {
MAX_BODY_SIZE,
createFileRL,
createFolderRL,
deleteFileRL,
renameFileRL,
saveFileRL,
} from "./ratelimit";
// Log errors and send a notification to the client
export const handleErrors = (message: string, error: any, socket: Socket) => {
console.error(message, error)
socket.emit("error", `${message} ${error.message ?? error}`)
}
dotenv.config();
// Handle uncaught exceptions
process.on("uncaughtException", (error) => {
console.error("Uncaught Exception:", error)
// Do not exit the process
})
// Handle unhandled promise rejections
process.on("unhandledRejection", (reason, promise) => {
console.error("Unhandled Rejection at:", promise, "reason:", reason)
// Do not exit the process
})
// Initialize containers and managers
const connections = new ConnectionManager()
const sandboxes: Record<string, Sandbox> = {}
// Load environment variables
dotenv.config()
// Initialize Express app and create HTTP server
const app: Express = express()
const port = process.env.PORT || 4000
app.use(cors())
const httpServer = createServer(app)
const app: Express = express();
const port = process.env.PORT || 4000;
app.use(cors());
const httpServer = createServer(app);
const io = new Server(httpServer, {
cors: {
origin: "*", // Allow connections from any origin
origin: "*",
},
})
});
// Middleware for socket authentication
io.use(socketAuth) // Use the new socketAuth middleware
let inactivityTimeout: NodeJS.Timeout | null = null;
let isOwnerConnected = false;
// Check for required environment variables
if (!process.env.DOKKU_HOST)
console.warn("Environment variable DOKKU_HOST is not defined")
if (!process.env.DOKKU_USERNAME)
console.warn("Environment variable DOKKU_USERNAME is not defined")
if (!process.env.DOKKU_KEY)
console.warn("Environment variable DOKKU_KEY is not defined")
const terminals: {
[id: string]: { terminal: IPty; onData: IDisposable; onExit: IDisposable };
} = {};
// Initialize Dokku client
const dokkuClient =
process.env.DOKKU_HOST && process.env.DOKKU_KEY && process.env.DOKKU_USERNAME
? new DokkuClient({
host: process.env.DOKKU_HOST,
username: process.env.DOKKU_USERNAME,
privateKey: fs.readFileSync(process.env.DOKKU_KEY),
})
: null
dokkuClient?.connect()
const dirName = path.join(__dirname, "..");
// Initialize Git client used to deploy Dokku apps
const gitClient =
process.env.DOKKU_HOST && process.env.DOKKU_KEY
? new SecureGitClient(
`dokku@${process.env.DOKKU_HOST}`,
process.env.DOKKU_KEY
)
: null
io.use(async (socket, next) => {
const handshakeSchema = z.object({
userId: z.string(),
sandboxId: z.string(),
EIO: z.string(),
transport: z.string(),
});
const q = socket.handshake.query;
const parseQuery = handshakeSchema.safeParse(q);
if (!parseQuery.success) {
next(new Error("Invalid request."));
return;
}
const { sandboxId, userId } = parseQuery.data;
const dbUser = await fetch(
`${process.env.DATABASE_WORKER_URL}/api/user?id=${userId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
);
const dbUserJSON = (await dbUser.json()) as User;
if (!dbUserJSON) {
next(new Error("DB error."));
return;
}
const sandbox = dbUserJSON.sandbox.find((s) => s.id === sandboxId);
const sharedSandboxes = dbUserJSON.usersToSandboxes.find(
(uts) => uts.sandboxId === sandboxId
);
if (!sandbox && !sharedSandboxes) {
next(new Error("Invalid credentials."));
return;
}
socket.data = {
userId,
sandboxId: sandboxId,
isOwner: sandbox !== undefined,
};
next();
});
// Handle a client connecting to the server
io.on("connection", async (socket) => {
try {
// This data comes is added by our authentication middleware
const data = socket.data as {
userId: string
sandboxId: string
isOwner: boolean
type: string
if (inactivityTimeout) clearTimeout(inactivityTimeout);
const data = socket.data as {
userId: string;
sandboxId: string;
isOwner: boolean;
};
if (data.isOwner) {
isOwnerConnected = true;
} else {
if (!isOwnerConnected) {
socket.emit("disableAccess", "The sandbox owner is not connected.");
return;
}
}
const sandboxFiles = await getSandboxFiles(data.sandboxId);
sandboxFiles.fileData.forEach((file) => {
const filePath = path.join(dirName, file.id);
fs.mkdirSync(path.dirname(filePath), { recursive: true });
fs.writeFile(filePath, file.data, function (err) {
if (err) throw err;
});
});
socket.emit("loaded", sandboxFiles.files);
socket.on("getFile", (fileId: string, callback) => {
const file = sandboxFiles.fileData.find((f) => f.id === fileId);
if (!file) return;
callback(file.data);
});
socket.on("getFolder", async (folderId: string, callback) => {
const files = await getFolder(folderId);
callback(files);
});
// todo: send diffs + debounce for efficiency
socket.on("saveFile", async (fileId: string, body: string) => {
try {
await saveFileRL.consume(data.userId, 1);
if (Buffer.byteLength(body, "utf-8") > MAX_BODY_SIZE) {
socket.emit(
"rateLimit",
"Rate limited: file size too large. Please reduce the file size."
);
return;
}
const file = sandboxFiles.fileData.find((f) => f.id === fileId);
if (!file) return;
file.data = body;
fs.writeFile(path.join(dirName, file.id), body, function (err) {
if (err) throw err;
});
await saveFile(fileId, body);
} catch (e) {
io.emit("rateLimit", "Rate limited: file saving. Please slow down.");
}
});
socket.on("moveFile", async (fileId: string, folderId: string, callback) => {
const file = sandboxFiles.fileData.find((f) => f.id === fileId);
if (!file) return;
const parts = fileId.split("/");
const newFileId = folderId + "/" + parts.pop();
fs.rename(
path.join(dirName, fileId),
path.join(dirName, newFileId),
function (err) {
if (err) throw err;
}
);
file.id = newFileId;
await renameFile(fileId, newFileId, file.data);
const newFiles = await getSandboxFiles(data.sandboxId);
callback(newFiles.files);
});
socket.on("createFile", async (name: string, callback) => {
try {
const size: number = await getProjectSize(data.sandboxId);
// limit is 200mb
if (size > 200 * 1024 * 1024) {
io.emit(
"rateLimit",
"Rate limited: project size exceeded. Please delete some files."
);
callback({ success: false });
}
await createFileRL.consume(data.userId, 1);
const id = `projects/${data.sandboxId}/${name}`;
fs.writeFile(path.join(dirName, id), "", function (err) {
if (err) throw err;
});
sandboxFiles.files.push({
id,
name,
type: "file",
});
sandboxFiles.fileData.push({
id,
data: "",
});
await createFile(id);
callback({ success: true });
} catch (e) {
io.emit("rateLimit", "Rate limited: file creation. Please slow down.");
}
});
socket.on("createFolder", async (name: string, callback) => {
try {
await createFolderRL.consume(data.userId, 1);
const id = `projects/${data.sandboxId}/${name}`;
fs.mkdir(path.join(dirName, id), { recursive: true }, function (err) {
if (err) throw err;
});
callback();
} catch (e) {
io.emit("rateLimit", "Rate limited: folder creation. Please slow down.");
}
});
socket.on("renameFile", async (fileId: string, newName: string) => {
try {
await renameFileRL.consume(data.userId, 1);
const file = sandboxFiles.fileData.find((f) => f.id === fileId);
if (!file) return;
file.id = newName;
const parts = fileId.split("/");
const newFileId =
parts.slice(0, parts.length - 1).join("/") + "/" + newName;
fs.rename(
path.join(dirName, fileId),
path.join(dirName, newFileId),
function (err) {
if (err) throw err;
}
);
await renameFile(fileId, newFileId, file.data);
} catch (e) {
io.emit("rateLimit", "Rate limited: file renaming. Please slow down.");
return;
}
});
socket.on("deleteFile", async (fileId: string, callback) => {
try {
await deleteFileRL.consume(data.userId, 1);
const file = sandboxFiles.fileData.find((f) => f.id === fileId);
if (!file) return;
fs.unlink(path.join(dirName, fileId), function (err) {
if (err) throw err;
});
sandboxFiles.fileData = sandboxFiles.fileData.filter(
(f) => f.id !== fileId
);
await deleteFile(fileId);
const newFiles = await getSandboxFiles(data.sandboxId);
callback(newFiles.files);
} catch (e) {
io.emit("rateLimit", "Rate limited: file deletion. Please slow down.");
}
});
// todo
// socket.on("renameFolder", async (folderId: string, newName: string) => {
// });
socket.on("deleteFolder", async (folderId: string, callback) => {
const files = await getFolder(folderId);
await Promise.all(
files.map(async (file) => {
fs.unlink(path.join(dirName, file), function (err) {
if (err) throw err;
});
sandboxFiles.fileData = sandboxFiles.fileData.filter(
(f) => f.id !== file
);
await deleteFile(file);
})
);
const newFiles = await getSandboxFiles(data.sandboxId);
callback(newFiles.files);
});
socket.on("createTerminal", (id: string, callback) => {
if (terminals[id] || Object.keys(terminals).length >= 4) {
return;
}
// Register the connection
connections.addConnectionForSandbox(socket, data.sandboxId, data.isOwner)
const pty = spawn(os.platform() === "win32" ? "cmd.exe" : "bash", [], {
name: "xterm",
cols: 100,
cwd: path.join(dirName, "projects", data.sandboxId),
});
// Disable access unless the sandbox owner is connected
if (!data.isOwner && !connections.ownerIsConnected(data.sandboxId)) {
socket.emit("disableAccess", "The sandbox owner is not connected.")
return
const onData = pty.onData((data) => {
io.emit("terminalResponse", {
id,
data,
});
});
const onExit = pty.onExit((code) => console.log("exit :(", code));
pty.write("export PS1='\\u > '\r");
pty.write("clear\r");
terminals[id] = {
terminal: pty,
onData,
onExit,
};
callback();
});
socket.on("resizeTerminal", (dimensions: { cols: number; rows: number }) => {
Object.values(terminals).forEach((t) => {
t.terminal.resize(dimensions.cols, dimensions.rows);
});
});
socket.on("terminalData", (id: string, data: string) => {
if (!terminals[id]) {
return;
}
try {
// Create or retrieve the sandbox manager for the given sandbox ID
const sandbox =
sandboxes[data.sandboxId] ??
new Sandbox(data.sandboxId, data.type, {
dokkuClient,
gitClient,
})
sandboxes[data.sandboxId] = sandbox
// This callback receives an update when the file list changes, and notifies all relevant connections.
const sendFileNotifications = (files: (TFolder | TFile)[]) => {
connections
.connectionsForSandbox(data.sandboxId)
.forEach((socket: Socket) => {
socket.emit("loaded", files)
})
}
// Initialize the sandbox container
// The file manager and terminal managers will be set up if they have been closed
await sandbox.initialize(sendFileNotifications)
socket.emit("loaded", sandbox.fileManager?.files)
// Register event handlers for the sandbox
// For each event handler, listen on the socket for that event
// Pass connection-specific information to the handlers
Object.entries(
sandbox.handlers({
userId: data.userId,
isOwner: data.isOwner,
socket,
})
).forEach(([event, handler]) => {
socket.on(
event,
async (options: any, callback?: (response: any) => void) => {
try {
const result = await handler(options)
callback?.(result)
} catch (e: any) {
handleErrors(`Error processing event "${event}":`, e, socket)
}
}
)
})
socket.emit("ready")
// Handle disconnection event
socket.on("disconnect", async () => {
try {
// Deregister the connection
connections.removeConnectionForSandbox(
socket,
data.sandboxId,
data.isOwner
)
// If the owner has disconnected from all sockets, close open terminals and file watchers.
// The sandbox itself will timeout after the heartbeat stops.
if (data.isOwner && !connections.ownerIsConnected(data.sandboxId)) {
await sandbox.disconnect()
socket.broadcast.emit(
"disableAccess",
"The sandbox owner has disconnected."
)
}
} catch (e: any) {
handleErrors("Error disconnecting:", e, socket)
}
})
} catch (e: any) {
handleErrors(`Error initializing sandbox ${data.sandboxId}:`, e, socket)
terminals[id].terminal.write(data);
} catch (e) {
console.log("Error writing to terminal", e);
}
} catch (e: any) {
handleErrors("Error connecting:", e, socket)
}
})
});
socket.on("closeTerminal", (id: string, callback) => {
if (!terminals[id]) {
return;
}
terminals[id].onData.dispose();
terminals[id].onExit.dispose();
delete terminals[id];
callback();
});
socket.on(
"generateCode",
async (
fileName: string,
code: string,
line: number,
instructions: string,
callback
) => {
const fetchPromise = fetch(
`${process.env.DATABASE_WORKER_URL}/api/sandbox/generate`,
{
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({
userId: data.userId,
}),
}
);
// Generate code from cloudflare workers AI
const generateCodePromise = fetch(
`${process.env.AI_WORKER_URL}/api?fileName=${fileName}&code=${code}&line=${line}&instructions=${instructions}`,
{
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.CF_AI_KEY}`,
},
}
);
const [fetchResponse, generateCodeResponse] = await Promise.all([
fetchPromise,
generateCodePromise,
]);
const json = await generateCodeResponse.json();
callback({ response: json.response, success: true });
}
);
socket.on("disconnect", async () => {
if (data.isOwner) {
Object.entries(terminals).forEach((t) => {
const { terminal, onData, onExit } = t[1];
onData.dispose();
onExit.dispose();
delete terminals[t[0]];
});
socket.broadcast.emit(
"disableAccess",
"The sandbox owner has disconnected."
);
}
// const sockets = await io.fetchSockets();
// if (inactivityTimeout) {
// clearTimeout(inactivityTimeout);
// }
// if (sockets.length === 0) {
// console.log("STARTING TIMER");
// inactivityTimeout = setTimeout(() => {
// io.fetchSockets().then(async (sockets) => {
// if (sockets.length === 0) {
// console.log("Server stopped", res);
// }
// });
// }, 20000);
// } else {
// console.log("number of sockets", sockets.length);
// }
});
});
// Start the server
httpServer.listen(port, () => {
console.log(`Server running on port ${port}`)
})
console.log(`Server running on port ${port}`);
});

View File

@ -30,4 +30,4 @@ export const deleteFileRL = new RateLimiterMemory({
export const deleteFolderRL = new RateLimiterMemory({
points: 1,
duration: 2,
})
})

View File

@ -1,75 +0,0 @@
import { Socket } from "socket.io"
import { z } from "zod"
import { Sandbox, User } from "./types"
// Middleware for socket authentication
export const socketAuth = async (socket: Socket, next: Function) => {
// Define the schema for handshake query validation
const handshakeSchema = z.object({
userId: z.string(),
sandboxId: z.string(),
EIO: z.string(),
transport: z.string(),
})
const q = socket.handshake.query
const parseQuery = handshakeSchema.safeParse(q)
// Check if the query is valid according to the schema
if (!parseQuery.success) {
next(new Error("Invalid request."))
return
}
const { sandboxId, userId } = parseQuery.data
// Fetch user data from the database
const dbUser = await fetch(
`${process.env.DATABASE_WORKER_URL}/api/user?id=${userId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
const dbUserJSON = (await dbUser.json()) as User
// Fetch sandbox data from the database
const dbSandbox = await fetch(
`${process.env.DATABASE_WORKER_URL}/api/sandbox?id=${sandboxId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
)
const dbSandboxJSON = (await dbSandbox.json()) as Sandbox
// Check if user data was retrieved successfully
if (!dbUserJSON) {
next(new Error("DB error."))
return
}
// Check if the user owns the sandbox or has shared access
const sandbox = dbUserJSON.sandbox.find((s) => s.id === sandboxId)
const sharedSandboxes = dbUserJSON.usersToSandboxes.find(
(uts) => uts.sandboxId === sandboxId
)
// If user doesn't own or have shared access to the sandbox, deny access
if (!sandbox && !sharedSandboxes) {
next(new Error("Invalid credentials."))
return
}
// Set socket data with user information
socket.data = {
userId,
sandboxId: sandboxId,
isOwner: sandbox !== undefined,
type: dbSandboxJSON.type,
}
// Allow the connection
next()
}
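// Usage note: this middleware is registered on the Socket.IO server in index.ts
// (shown elsewhere in this diff) via `io.use(socketAuth)`.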

View File

@ -1,75 +1,70 @@
// DB Types
export type User = {
id: string
name: string
email: string
generations: number
sandbox: Sandbox[]
usersToSandboxes: UsersToSandboxes[]
}
id: string;
name: string;
email: string;
generations: number;
sandbox: Sandbox[];
usersToSandboxes: UsersToSandboxes[];
};
export type Sandbox = {
id: string
name: string
type: "reactjs" | "vanillajs" | "nextjs" | "streamlit"
visibility: "public" | "private"
createdAt: Date
userId: string
usersToSandboxes: UsersToSandboxes[]
}
id: string;
name: string;
type: "react" | "node";
visibility: "public" | "private";
createdAt: Date;
userId: string;
usersToSandboxes: UsersToSandboxes[];
};
export type UsersToSandboxes = {
userId: string
sandboxId: string
sharedOn: Date
}
userId: string;
sandboxId: string;
sharedOn: Date;
};
export type TFolder = {
id: string
type: "folder"
name: string
children: (TFile | TFolder)[]
}
id: string;
type: "folder";
name: string;
children: (TFile | TFolder)[];
};
export type TFile = {
id: string
type: "file"
name: string
}
id: string;
type: "file";
name: string;
};
export type TFileData = {
id: string
data: string
}
id: string;
data: string;
};
export type R2Files = {
objects: R2FileData[]
truncated: boolean
delimitedPrefixes: any[]
}
objects: R2FileData[];
truncated: boolean;
delimitedPrefixes: any[];
};
export type R2FileData = {
storageClass: string
uploaded: string
checksums: any
httpEtag: string
etag: string
size: number
version: string
key: string
}
storageClass: string;
uploaded: string;
checksums: any;
httpEtag: string;
etag: string;
size: number;
version: string;
key: string;
};
export type R2FileBody = R2FileData & {
body: ReadableStream
bodyUsed: boolean
arrayBuffer: Promise<ArrayBuffer>
text: Promise<string>
json: Promise<any>
blob: Promise<Blob>
}
export interface DokkuResponse {
success: boolean
apps?: string[]
message?: string
}
body: ReadableStream;
bodyUsed: boolean;
arrayBuffer: Promise<ArrayBuffer>;
text: Promise<string>;
json: Promise<any>;
blob: Promise<Blob>;
};

View File

@ -1,23 +1,177 @@
export class LockManager {
private locks: { [key: string]: Promise<any> }
import * as dotenv from "dotenv";
import {
R2FileBody,
R2Files,
Sandbox,
TFile,
TFileData,
TFolder,
} from "./types";
constructor() {
this.locks = {}
}
dotenv.config();
async acquireLock<T>(key: string, task: () => Promise<T>): Promise<T> {
if (!this.locks[key]) {
this.locks[key] = new Promise<T>(async (resolve, reject) => {
try {
const result = await task()
resolve(result)
} catch (error) {
reject(error)
} finally {
delete this.locks[key]
}
})
export const getSandboxFiles = async (id: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?sandboxId=${id}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
return await this.locks[key]
);
const data: R2Files = await res.json();
const paths = data.objects.map((obj) => obj.key);
const processedFiles = await processFiles(paths, id);
return processedFiles;
};
export const getFolder = async (folderId: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?folderId=${folderId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
);
const data: R2Files = await res.json();
return data.objects.map((obj) => obj.key);
};
const processFiles = async (paths: string[], id: string) => {
const root: TFolder = { id: "/", type: "folder", name: "/", children: [] };
const fileData: TFileData[] = [];
paths.forEach((path) => {
const allParts = path.split("/");
if (allParts[1] !== id) {
return;
}
const parts = allParts.slice(2);
let current: TFolder = root;
for (let i = 0; i < parts.length; i++) {
const part = parts[i];
const isFile = i === parts.length - 1 && part.includes(".");
const existing = current.children.find((child) => child.name === part);
if (existing) {
if (!isFile) {
current = existing as TFolder;
}
} else {
if (isFile) {
const file: TFile = { id: path, type: "file", name: part };
current.children.push(file);
fileData.push({ id: path, data: "" });
} else {
const folder: TFolder = {
// id: path, // todo: wrong id. for example, folder "src" ID is: projects/a7vgttfqbgy403ratp7du3ln/src/App.css
id: `projects/${id}/${parts.slice(0, i + 1).join("/")}`,
type: "folder",
name: part,
children: [],
};
current.children.push(folder);
current = folder;
}
}
}
});
await Promise.all(
fileData.map(async (file) => {
const data = await fetchFileContent(file.id);
file.data = data;
})
);
return {
files: root.children,
fileData,
};
};
const fetchFileContent = async (fileId: string): Promise<string> => {
try {
const fileRes = await fetch(
`${process.env.STORAGE_WORKER_URL}/api?fileId=${fileId}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
);
return await fileRes.text();
} catch (error) {
console.error("ERROR fetching file:", error);
return "";
}
}
};
export const createFile = async (fileId: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId }),
});
return res.ok;
};
export const renameFile = async (
fileId: string,
newFileId: string,
data: string
) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api/rename`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId, newFileId, data }),
});
return res.ok;
};
export const saveFile = async (fileId: string, data: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api/save`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId, data }),
});
return res.ok;
};
export const deleteFile = async (fileId: string) => {
const res = await fetch(`${process.env.STORAGE_WORKER_URL}/api`, {
method: "DELETE",
headers: {
"Content-Type": "application/json",
Authorization: `${process.env.WORKERS_KEY}`,
},
body: JSON.stringify({ fileId }),
});
return res.ok;
};
export const getProjectSize = async (id: string) => {
const res = await fetch(
`${process.env.STORAGE_WORKER_URL}/api/size?sandboxId=${id}`,
{
headers: {
Authorization: `${process.env.WORKERS_KEY}`,
},
}
);
return (await res.json()).size;
};

View File

@ -0,0 +1,5 @@
{
"tabWidth": 2,
"semi": false,
"singleQuote": false
}

File diff suppressed because it is too large.

View File

@ -1,23 +1,22 @@
{
"name": "storage",
"version": "0.0.0",
"private": true,
"scripts": {
"deploy": "wrangler deploy",
"dev": "wrangler dev --remote",
"start": "wrangler dev",
"test": "vitest",
"cf-typegen": "wrangler types"
},
"devDependencies": {
"@cloudflare/vitest-pool-workers": "^0.1.0",
"@cloudflare/workers-types": "^4.20241106.0",
"typescript": "^5.0.4",
"vitest": "1.3.0",
"wrangler": "^3.86.0"
},
"dependencies": {
"p-limit": "^6.1.0",
"zod": "^3.23.4"
}
"name": "storage",
"version": "0.0.0",
"private": true,
"scripts": {
"deploy": "wrangler deploy",
"dev": "wrangler dev --remote",
"start": "wrangler dev",
"test": "vitest",
"cf-typegen": "wrangler types"
},
"devDependencies": {
"@cloudflare/vitest-pool-workers": "^0.1.0",
"@cloudflare/workers-types": "^4.20240419.0",
"typescript": "^5.0.4",
"vitest": "1.3.0",
"wrangler": "^3.0.0"
},
"dependencies": {
"zod": "^3.23.4"
}
}

View File

@ -1,9 +1,8 @@
import { ExecutionContext, R2Bucket, Headers as CFHeaders } from "@cloudflare/workers-types"
import { z } from "zod"
import startercode from "./startercode"
export interface Env {
R2: R2Bucket
Templates: R2Bucket
KEY: string
}
@ -76,13 +75,14 @@ export default {
if (obj === null) {
return new Response(`${fileId} not found`, { status: 404 })
}
const headers = new Headers() as unknown as CFHeaders
const headers = new Headers()
headers.set("etag", obj.httpEtag)
obj.writeHttpMetadata(headers)
const text = await obj.text()
return new Response(text, {
headers: Object.fromEntries(headers.entries()),
headers,
})
} else return invalidRequest
} else if (method === "POST") {
@ -135,7 +135,22 @@ export default {
return success
} else if (path === "/api/init" && method === "POST") {
// This API path no longer does anything, because template files are stored in E2B sandbox templates.
const initSchema = z.object({
sandboxId: z.string(),
type: z.enum(["react", "node"]),
})
const body = await request.json()
const { sandboxId, type } = initSchema.parse(body)
console.log(startercode[type])
await Promise.all(
startercode[type].map(async (file) => {
await env.R2.put(`projects/${sandboxId}/${file.name}`, file.body)
})
)
return success
} else {
return notFound

View File

@ -0,0 +1,151 @@
const startercode = {
node: [
{ name: "index.js", body: `console.log("Hello World!")` },
{
name: "package.json",
body: `{
"name": "nodejs",
"version": "1.0.0",
"description": "",
"main": "index.js",
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"@types/node": "^18.0.6"
}
}`,
},
],
react: [
{
name: "package.json",
body: `{
"name": "react",
"private": true,
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "vite build",
"lint": "eslint . --ext js,jsx --report-unused-disable-directives --max-warnings 0",
"preview": "vite preview"
},
"dependencies": {
"react": "^18.2.0",
"react-dom": "^18.2.0"
},
"devDependencies": {
"@types/react": "^18.2.66",
"@types/react-dom": "^18.2.22",
"@vitejs/plugin-react": "^4.2.1",
"eslint": "^8.57.0",
"eslint-plugin-react": "^7.34.1",
"eslint-plugin-react-hooks": "^4.6.0",
"eslint-plugin-react-refresh": "^0.4.6",
"vite": "^5.2.0"
}
}`,
},
{
name: "vite.config.js",
body: `import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'
// https://vitejs.dev/config/
export default defineConfig({
plugins: [react()],
server: {
port: 5173,
host: "0.0.0.0",
}
})
`,
},
{
name: "index.html",
body: `<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>React Starter Code</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.jsx"></script>
</body>
</html>
`,
},
{
name: "src/App.css",
body: `div {
width: 100%;
height: 100vh;
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
font-family: sans-serif;
}
h1 {
color: #000;
margin: 0;
}
p {
color: #777;
margin: 0;
}
button {
padding: 8px 16px;
margin-top: 16px;
}`,
},
{
name: "src/App.jsx",
body: `import './App.css'
import { useState } from 'react'
function App() {
const [count, setCount] = useState(0)
return (
<div>
<h1>React Starter Code</h1>
<p>
Edit App.jsx to get started.
</p>
<button onClick={() => setCount(count => count + 1)}>
Clicked {count} times
</button>
</div>
)
}
export default App
`,
},
{
name: "src/main.jsx",
body: `import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App.jsx'
ReactDOM.createRoot(document.getElementById('root')).render(
<React.StrictMode>
<App />
</React.StrictMode>,
)
`,
},
],
}
export default startercode

View File

@ -1,30 +1,25 @@
// test/index.spec.ts
import {
createExecutionContext,
env,
SELF,
waitOnExecutionContext,
} from "cloudflare:test"
import { describe, expect, it } from "vitest"
import worker from "../src/index"
import { env, createExecutionContext, waitOnExecutionContext, SELF } from 'cloudflare:test';
import { describe, it, expect } from 'vitest';
import worker from '../src/index';
// For now, you'll need to do something like this to get a correctly-typed
// `Request` to pass to `worker.fetch()`.
const IncomingRequest = Request<unknown, IncomingRequestCfProperties>
const IncomingRequest = Request<unknown, IncomingRequestCfProperties>;
describe("Hello World worker", () => {
it("responds with Hello World! (unit style)", async () => {
const request = new IncomingRequest("http://example.com")
describe('Hello World worker', () => {
it('responds with Hello World! (unit style)', async () => {
const request = new IncomingRequest('http://example.com');
// Create an empty context to pass to `worker.fetch()`.
const ctx = createExecutionContext()
const response = await worker.fetch(request, env, ctx)
const ctx = createExecutionContext();
const response = await worker.fetch(request, env, ctx);
// Wait for all `Promise`s passed to `ctx.waitUntil()` to settle before running test assertions
await waitOnExecutionContext(ctx)
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`)
})
await waitOnExecutionContext(ctx);
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
it("responds with Hello World! (integration style)", async () => {
const response = await SELF.fetch("https://example.com")
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`)
})
})
it('responds with Hello World! (integration style)', async () => {
const response = await SELF.fetch('https://example.com');
expect(await response.text()).toMatchInlineSnapshot(`"Hello World!"`);
});
});

View File

@ -1,11 +1,11 @@
{
"extends": "../tsconfig.json",
"compilerOptions": {
"types": [
"@cloudflare/workers-types/experimental",
"@cloudflare/vitest-pool-workers"
]
},
"include": ["./**/*.ts", "../src/env.d.ts"],
"exclude": []
"extends": "../tsconfig.json",
"compilerOptions": {
"types": [
"@cloudflare/workers-types/experimental",
"@cloudflare/vitest-pool-workers"
]
},
"include": ["./**/*.ts", "../src/env.d.ts"],
"exclude": []
}

View File

@ -12,9 +12,7 @@
/* Language and Environment */
"target": "es2021" /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */,
"lib": [
"es2021"
] /* Specify a set of bundled library declaration files that describe the target runtime environment. */,
"lib": ["es2021"] /* Specify a set of bundled library declaration files that describe the target runtime environment. */,
"jsx": "react" /* Specify what JSX code is generated. */,
// "experimentalDecorators": true, /* Enable experimental support for TC39 stage 2 draft decorators. */
// "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */
@ -34,7 +32,7 @@
// "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */
// "typeRoots": [], /* Specify multiple folders that act like `./node_modules/@types`. */
"types": [
"@cloudflare/workers-types"
"@cloudflare/workers-types/2023-07-01"
] /* Specify type package names to be included without being referenced in a source file. */,
// "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */
"resolveJsonModule": true /* Enable importing .json files */,

View File

@ -1,11 +1,11 @@
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config"
import { defineWorkersConfig } from "@cloudflare/vitest-pool-workers/config";
export default defineWorkersConfig({
test: {
poolOptions: {
workers: {
wrangler: { configPath: "./wrangler.toml" },
},
},
},
})
test: {
poolOptions: {
workers: {
wrangler: { configPath: "./wrangler.toml" },
},
},
},
});

View File

@ -1,3 +1,4 @@
// Generated by Wrangler
// After adding bindings to `wrangler.toml`, regenerate this interface via `npm run cf-typegen`
interface Env {}
interface Env {
}

View File

@ -5,19 +5,14 @@ LIVEBLOCKS_SECRET_KEY=
NEXT_PUBLIC_SERVER_PORT=4000
NEXT_PUBLIC_APP_URL=http://localhost:3000
NEXT_PUBLIC_API_URL=http://localhost:4000
NEXT_PUBLIC_SERVER_URL=http://localhost:4000
# Set WORKER_URLs after deploying the workers.
# Set NEXT_PUBLIC_WORKERS_KEY to be the same as KEY in /backend/storage/wrangler.toml.
NEXT_PUBLIC_DATABASE_WORKER_URL=https://database.your-worker.workers.dev
NEXT_PUBLIC_STORAGE_WORKER_URL=https://storage.your-worker.workers.dev
NEXT_PUBLIC_WORKERS_KEY=SUPERDUPERSECRET
NEXT_PUBLIC_DATABASE_WORKER_URL=
NEXT_PUBLIC_STORAGE_WORKER_URL=
NEXT_PUBLIC_WORKERS_KEY=
NEXT_PUBLIC_CLERK_SIGN_IN_URL=/sign-in
NEXT_PUBLIC_CLERK_SIGN_UP_URL=/sign-up
NEXT_PUBLIC_CLERK_AFTER_SIGN_IN_URL=/dashboard
NEXT_PUBLIC_CLERK_AFTER_SIGN_UP_URL=/dashboard
ANTHROPIC_API_KEY=
OPENAI_API_KEY=
NEXT_PUBLIC_CLERK_AFTER_SIGN_UP_URL=/dashboard

5
frontend/.prettierrc Normal file
View File

@ -0,0 +1,5 @@
{
"tabWidth": 2,
"semi": false,
"singleQuote": false
}

View File

@ -1,11 +1,11 @@
// import { Room } from "@/components/editor/live/room"
import Loading from "@/components/editor/loading"
import Navbar from "@/components/editor/navbar"
import { TerminalProvider } from "@/context/TerminalContext"
import { Room } from "@/components/editor/live/room"
import { Sandbox, User, UsersToSandboxes } from "@/lib/types"
import { currentUser } from "@clerk/nextjs"
import dynamic from "next/dynamic"
import { notFound, redirect } from "next/navigation"
import Loading from "@/components/editor/loading"
import dynamic from "next/dynamic"
import fs from "fs"
export const revalidate = 0
@ -51,11 +51,7 @@ const getSharedUsers = async (usersToSandboxes: UsersToSandboxes[]) => {
}
)
const userData: User = await userRes.json()
return {
id: userData.id,
name: userData.name,
avatarUrl: userData.avatarUrl,
}
return { id: userData.id, name: userData.name }
})
)
@ -67,6 +63,14 @@ const CodeEditor = dynamic(() => import("@/components/editor"), {
loading: () => <Loading />,
})
function getReactDefinitionFile() {
const reactDefinitionFile = fs.readFileSync(
"node_modules/@types/react/index.d.ts",
"utf8"
)
return reactDefinitionFile
}
export default async function CodePage({ params }: { params: { id: string } }) {
const user = await currentUser()
const sandboxId = params.id
@ -90,18 +94,20 @@ export default async function CodePage({ params }: { params: { id: string } }) {
return notFound()
}
const reactDefinitionFile = getReactDefinitionFile()
return (
<TerminalProvider>
{/* <Room id={sandboxId}> */}
<div className="overflow-hidden overscroll-none w-screen h-screen grid [grid-template-rows:3.5rem_auto] bg-background">
<Navbar
userData={userData}
sandboxData={sandboxData}
shared={shared as { id: string; name: string; avatarUrl: string }[]}
/>
<CodeEditor userData={userData} sandboxData={sandboxData} />
</div>
{/* </Room> */}
</TerminalProvider>
<div className="overflow-hidden overscroll-none w-screen flex flex-col h-screen bg-background">
<Room id={sandboxId}>
<Navbar userData={userData} sandboxData={sandboxData} shared={shared} />
<div className="w-screen flex grow">
<CodeEditor
userData={userData}
sandboxData={sandboxData}
reactDefinitionFile={reactDefinitionFile}
/>
</div>
</Room>
</div>
)
}

View File

@ -1,8 +1,8 @@
import { UserButton, currentUser } from "@clerk/nextjs"
import { redirect } from "next/navigation"
import Dashboard from "@/components/dashboard"
import Navbar from "@/components/dashboard/navbar"
import { User } from "@/lib/types"
import { currentUser } from "@clerk/nextjs"
import { redirect } from "next/navigation"
import { Sandbox, User } from "@/lib/types"
export default async function DashboardPage() {
const user = await currentUser()
@ -35,7 +35,6 @@ export default async function DashboardPage() {
type: "react" | "node"
author: string
sharedOn: Date
authorAvatarUrl: string
}[]
return (

View File

@ -1,5 +1,4 @@
import { User } from "@/lib/types"
import { generateUniqueUsername } from "@/lib/username-generator"
import { currentUser } from "@clerk/nextjs"
import { redirect } from "next/navigation"
@ -25,27 +24,6 @@ export default async function AppAuthLayout({
const dbUserJSON = (await dbUser.json()) as User
if (!dbUserJSON.id) {
// Try to get GitHub username if available
const githubUsername = user.externalAccounts.find(
(account) => account.provider === "github"
)?.username
const username =
githubUsername ||
(await generateUniqueUsername(async (username) => {
// Check if username exists in database
const userCheck = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user/check-username?username=${username}`,
{
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
},
}
)
const exists = await userCheck.json()
return exists.exists
}))
const res = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user`,
{
@ -58,20 +36,9 @@ export default async function AppAuthLayout({
id: user.id,
name: user.firstName + " " + user.lastName,
email: user.emailAddresses[0].emailAddress,
username: username,
avatarUrl: user.imageUrl || null,
createdAt: new Date().toISOString(),
}),
}
)
if (!res.ok) {
const error = await res.text()
console.error("Failed to create user:", error)
} else {
const data = await res.json()
console.log("User created successfully:", data)
}
}
return <>{children}</>

View File

@ -1,65 +0,0 @@
import ProfilePage from "@/components/profile"
import ProfileNavbar from "@/components/profile/navbar"
import { SandboxWithLiked, User } from "@/lib/types"
import { currentUser } from "@clerk/nextjs"
import { notFound } from "next/navigation"
export default async function Page({
params: { username: rawUsername },
}: {
params: { username: string }
}) {
const username = decodeURIComponent(rawUsername).replace("@", "")
const loggedInClerkUser = await currentUser()
const [profileOwnerResponse, loggedInUserResponse] = await Promise.all([
fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user?username=${username}&currentUserId=${loggedInClerkUser?.id}`,
{
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
},
}
),
fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user?id=${loggedInClerkUser?.id}`,
{
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
},
}
),
])
const profileOwner = (await profileOwnerResponse.json()) as User
const loggedInUser = (await loggedInUserResponse.json()) as User
if (!Boolean(profileOwner?.id)) {
notFound()
}
const publicSandboxes: SandboxWithLiked[] = []
const privateSandboxes: SandboxWithLiked[] = []
profileOwner?.sandbox?.forEach((sandbox) => {
if (sandbox.visibility === "public") {
publicSandboxes.push(sandbox as SandboxWithLiked)
} else if (sandbox.visibility === "private") {
privateSandboxes.push(sandbox as SandboxWithLiked)
}
})
const isUserLoggedIn = Boolean(loggedInUser?.id)
return (
<section>
<ProfileNavbar userData={loggedInUser} />
<ProfilePage
publicSandboxes={publicSandboxes}
privateSandboxes={
profileOwner?.id === loggedInUser.id ? privateSandboxes : []
}
profileOwner={profileOwner}
loggedInUser={isUserLoggedIn ? loggedInUser : null}
/>
</section>
)
}

View File

@ -1,237 +0,0 @@
import {
ignoredFiles,
ignoredFolders,
} from "@/components/editor/AIChat/lib/ignored-paths"
import { templateConfigs } from "@/lib/templates"
import { TIERS } from "@/lib/tiers"
import { TFile, TFolder } from "@/lib/types"
import { Anthropic } from "@anthropic-ai/sdk"
import { currentUser } from "@clerk/nextjs"
const anthropic = new Anthropic({
apiKey: process.env.ANTHROPIC_API_KEY!,
})
// Format file structure for context
function formatFileStructure(
items: (TFile | TFolder)[] | undefined,
prefix = ""
): string {
if (!items || !Array.isArray(items)) {
return "No files available"
}
// Sort items to show folders first, then files
const sortedItems = [...items].sort((a, b) => {
if (a.type === b.type) return a.name.localeCompare(b.name)
return a.type === "folder" ? -1 : 1
})
return sortedItems
.map((item) => {
if (
item.type === "file" &&
!ignoredFiles.some(
(pattern) =>
item.name.endsWith(pattern.replace("*", "")) ||
item.name === pattern
)
) {
return `${prefix}├── ${item.name}`
} else if (
item.type === "folder" &&
!ignoredFolders.some((folder) => folder === item.name)
) {
const folderContent = formatFileStructure(
item.children,
`${prefix}`
)
return `${prefix}├── ${item.name}/\n${folderContent}`
}
return null
})
.filter(Boolean)
.join("\n")
}
export async function POST(request: Request) {
try {
const user = await currentUser()
if (!user) {
return new Response("Unauthorized", { status: 401 })
}
// Check and potentially reset monthly usage
const resetResponse = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user/check-reset`,
{
method: "POST",
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
"Content-Type": "application/json",
},
body: JSON.stringify({ userId: user.id }),
}
)
if (!resetResponse.ok) {
console.error("Failed to check usage reset")
}
// Get user data and check tier
const dbUser = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user?id=${user.id}`,
{
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
},
}
)
const userData = await dbUser.json()
// Get tier settings
const tierSettings =
TIERS[userData.tier as keyof typeof TIERS] || TIERS.FREE
if (userData.generations >= tierSettings.generations) {
return new Response(
`AI generation limit reached for your ${userData.tier || "FREE"} tier`,
{ status: 429 }
)
}
const {
messages,
context,
activeFileContent,
isEditMode,
fileName,
line,
templateType,
files,
projectName,
} = await request.json()
// Get template configuration
const templateConfig = templateConfigs[templateType]
// Create template context
const templateContext = templateConfig
? `
Project Template: ${templateConfig.name}
Current File Structure:
${files ? formatFileStructure(files) : "No files available"}
Conventions:
${templateConfig.conventions.join("\n")}
Dependencies:
${JSON.stringify(templateConfig.dependencies, null, 2)}
Scripts:
${JSON.stringify(templateConfig.scripts, null, 2)}
`
: ""
// Create system message based on mode
let systemMessage
if (isEditMode) {
systemMessage = `You are an AI code editor working in a ${templateType} project. Your task is to modify the given code based on the user's instructions. Only output the modified code, without any explanations or markdown formatting. The code should be a direct replacement for the existing code. If there is no code to modify, refer to the active file content and only output the code that is relevant to the user's instructions.
${templateContext}
File: ${fileName}
Line: ${line}
Context:
${context || "No additional context provided"}
Active File Content:
${activeFileContent}
Instructions: ${messages[0].content}
Respond only with the modified code that can directly replace the existing code.`
} else {
systemMessage = `You are an intelligent programming assistant for a ${templateType} project. Please respond to the following request concisely. When providing code:
1. Format it using triple backticks (\`\`\`) with the appropriate language identifier.
2. Always specify the complete file path in the format:
${projectName}/filepath/to/file.ext
3. If creating a new file, specify the path as:
${projectName}/filepath/to/file.ext (new file)
4. Format your code blocks as:
${projectName}/filepath/to/file.ext
\`\`\`language
code here
\`\`\`
If multiple files are involved, repeat the format for each file. Provide a clear and concise explanation along with any code snippets. Keep your response brief and to the point.
This is the project template:
${templateContext}
${context ? `Context:\n${context}\n` : ""}
${activeFileContent ? `Active File Content:\n${activeFileContent}\n` : ""}`
}
// Create stream response
const stream = await anthropic.messages.create({
model: tierSettings.model,
max_tokens: tierSettings.maxTokens,
system: systemMessage,
messages: messages.map((msg: { role: string; content: string }) => ({
role: msg.role === "human" ? "user" : "assistant",
content: msg.content,
})),
stream: true,
})
// Increment user's generation count
await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user/increment-generations`,
{
method: "POST",
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
"Content-Type": "application/json",
},
body: JSON.stringify({ userId: user.id }),
}
)
// Return streaming response
const encoder = new TextEncoder()
return new Response(
new ReadableStream({
async start(controller) {
for await (const chunk of stream) {
if (
chunk.type === "content_block_delta" &&
chunk.delta.type === "text_delta"
) {
controller.enqueue(encoder.encode(chunk.delta.text))
}
}
controller.close()
},
}),
{
headers: {
"Content-Type": "text/plain; charset=utf-8",
"Cache-Control": "no-cache",
Connection: "keep-alive",
},
}
)
} catch (error) {
console.error("AI generation error:", error)
return new Response(
error instanceof Error ? error.message : "Internal Server Error",
{ status: 500 }
)
}
}

View File

@ -1,61 +1,57 @@
// import { colors } from "@/lib/colors"
// import { User } from "@/lib/types"
import { colors } from "@/lib/colors"
import { User } from "@/lib/types"
import { currentUser } from "@clerk/nextjs"
// import { Liveblocks } from "@liveblocks/node"
import { Liveblocks } from "@liveblocks/node"
import { NextRequest } from "next/server"
// const API_KEY = process.env.LIVEBLOCKS_SECRET_KEY!
const API_KEY = process.env.LIVEBLOCKS_SECRET_KEY!
// const liveblocks = new Liveblocks({
// secret: API_KEY!,
// })
const liveblocks = new Liveblocks({
secret: API_KEY!,
})
export async function POST(request: NextRequest) {
// Temporarily return a 503 while Liveblocks is disabled
return new Response("Liveblocks collaboration temporarily disabled", { status: 503 })
const clerkUser = await currentUser()
// Original implementation commented out:
// const clerkUser = await currentUser()
//
// if (!clerkUser) {
// return new Response("Unauthorized", { status: 401 })
// }
//
// const res = await fetch(
// `${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user?id=${clerkUser.id}`,
// {
// headers: {
// Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
// },
// }
// )
// const user = (await res.json()) as User
//
// const colorNames = Object.keys(colors)
// const randomColor = colorNames[
// Math.floor(Math.random() * colorNames.length)
// ] as keyof typeof colors
// const code = colors[randomColor]
//
// // Create a session for the current user
// // userInfo is made available in Liveblocks presence hooks, e.g. useOthers
// const session = liveblocks.prepareSession(user.id, {
// userInfo: {
// name: user.name,
// email: user.email,
// color: randomColor,
// },
// })
//
// // Give the user access to the room
// user.sandbox.forEach((sandbox) => {
// session.allow(`${sandbox.id}`, session.FULL_ACCESS)
// })
// user.usersToSandboxes.forEach((userToSandbox) => {
// session.allow(`${userToSandbox.sandboxId}`, session.FULL_ACCESS)
// })
//
// // Authorize the user and return the result
// const { body, status } = await session.authorize()
// return new Response(body, { status })
if (!clerkUser) {
return new Response("Unauthorized", { status: 401 })
}
const res = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user?id=${clerkUser.id}`,
{
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
},
}
)
const user = (await res.json()) as User
const colorNames = Object.keys(colors)
const randomColor = colorNames[
Math.floor(Math.random() * colorNames.length)
] as keyof typeof colors
const code = colors[randomColor]
// Create a session for the current user
// userInfo is made available in Liveblocks presence hooks, e.g. useOthers
const session = liveblocks.prepareSession(user.id, {
userInfo: {
name: user.name,
email: user.email,
color: randomColor,
},
})
// Give the user access to the room
user.sandbox.forEach((sandbox) => {
session.allow(`${sandbox.id}`, session.FULL_ACCESS)
})
user.usersToSandboxes.forEach((userToSandbox) => {
session.allow(`${userToSandbox.sandboxId}`, session.FULL_ACCESS)
})
// Authorize the user and return the result
const { body, status } = await session.authorize()
return new Response(body, { status })
}

View File

@ -1,69 +0,0 @@
import OpenAI from "openai"
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
})
export async function POST(request: Request) {
try {
const { originalCode, newCode, fileName } = await request.json()
const systemPrompt = `You are a code merging assistant. Your task is to merge the new code snippet with the original file content while:
1. Preserving the original file's functionality
2. Ensuring proper integration of the new code
3. Maintaining consistent style and formatting
4. Resolving any potential conflicts
5. Output ONLY the raw code without any:
- Code fence markers (\`\`\`)
- Language identifiers (typescript, javascript, etc.)
- Explanations or comments
- Markdown formatting
The output should be the exact code that will replace the existing code, nothing more and nothing less.
Important: When merging, preserve the original code structure as much as possible. Only make necessary changes to integrate the new code while maintaining the original code's organization and style.`
const mergedCode = `Original file (${fileName}):\n${originalCode}\n\nNew code to merge:\n${newCode}`
const response = await openai.chat.completions.create({
model: "gpt-4o",
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: mergedCode },
],
prediction: {
type: "content",
content: mergedCode,
},
stream: true,
})
// Clean and stream response
const encoder = new TextEncoder()
return new Response(
new ReadableStream({
async start(controller) {
let buffer = ""
for await (const chunk of response) {
if (chunk.choices[0]?.delta?.content) {
buffer += chunk.choices[0].delta.content
// Clean any code fence markers that might appear in the stream
const cleanedContent = buffer
.replace(/^```[\w-]*\n|```\s*$/gm, "") // Remove code fences
.replace(/^(javascript|typescript|python|html|css)\n/gm, "") // Remove language identifiers
controller.enqueue(encoder.encode(cleanedContent))
buffer = ""
}
}
controller.close()
},
})
)
} catch (error) {
console.error("Merge error:", error)
return new Response(
error instanceof Error ? error.message : "Failed to merge code",
{ status: 500 }
)
}
}

View File

@ -1,42 +0,0 @@
import { currentUser } from "@clerk/nextjs"
export async function POST(request: Request) {
try {
const user = await currentUser()
if (!user) {
return new Response("Unauthorized", { status: 401 })
}
const { tier } = await request.json()
// handle payment processing here
const response = await fetch(
`${process.env.NEXT_PUBLIC_DATABASE_WORKER_URL}/api/user/update-tier`,
{
method: "POST",
headers: {
Authorization: `${process.env.NEXT_PUBLIC_WORKERS_KEY}`,
"Content-Type": "application/json",
},
body: JSON.stringify({
userId: user.id,
tier,
tierExpiresAt: new Date(Date.now() + 30 * 24 * 60 * 60 * 1000), // 30 days
}),
}
)
if (!response.ok) {
throw new Error("Failed to upgrade tier")
}
return new Response("Tier upgraded successfully")
} catch (error) {
console.error("Tier upgrade error:", error)
return new Response(
error instanceof Error ? error.message : "Internal Server Error",
{ status: 500 }
)
}
}

BIN
frontend/app/favicon.ico Normal file

Binary file not shown.

After (25 KiB)

View File

@ -99,30 +99,6 @@
); /* violet 900 -> bg */
}
.light .gradient-button-bg {
background: radial-gradient(
circle at top,
#f5f5f5 0%,
/* Very light gray */ #e0e0e0 50% /* Soft gray */
);
}
.light .gradient-button {
background: radial-gradient(
circle at bottom,
hsl(0, 0%, 85%) -10%,
/* Slightly darker gray */ hsl(0, 0%, 95%) 50% /* Very soft light gray */
);
}
.light .gradient-button-bg > div:hover {
background: radial-gradient(
circle at bottom,
hsl(0, 0%, 80%) -10%,
/* Slightly darker gray for hover */ hsl(0, 0%, 90%) 80% /* Softer gray */
);
}
.inline-decoration::before {
content: "Generate";
color: #525252;
@ -176,23 +152,3 @@
.tab-scroll::-webkit-scrollbar {
display: none;
}
.added-line-decoration {
background-color: rgba(0, 255, 0, 0.1);
}
.removed-line-decoration {
background-color: rgba(255, 0, 0, 0.1);
}
.added-line-glyph {
background-color: #28a745;
width: 4px !important;
margin-left: 3px;
}
.removed-line-glyph {
background-color: #dc3545;
width: 4px !important;
margin-left: 3px;
}

File diff suppressed because one or more lines are too long

Image file changed (17 KiB before); preview not shown.

View File

@ -1,32 +1,15 @@
import { Toaster } from "@/components/ui/sonner"
import { ThemeProvider } from "@/components/ui/theme-provider"
import { PreviewProvider } from "@/context/PreviewContext"
import { SocketProvider } from "@/context/SocketContext"
import { ClerkProvider } from "@clerk/nextjs"
import { Analytics } from "@vercel/analytics/react"
import { GeistMono } from "geist/font/mono"
import { GeistSans } from "geist/font/sans"
import type { Metadata } from "next"
import { GeistSans } from "geist/font/sans"
import { GeistMono } from "geist/font/mono"
import "./globals.css"
import { ThemeProvider } from "@/components/layout/themeProvider"
import { ClerkProvider } from "@clerk/nextjs"
import { Toaster } from "@/components/ui/sonner"
import { Analytics } from "@vercel/analytics/react"
export const metadata: Metadata = {
title: "Sandbox",
description:
"an open-source cloud-based code editing environment with custom AI code generation, live preview, real-time collaboration, and AI chat",
openGraph: {
type: "website",
url: "https://sandbox.gitwit.dev",
title: "Sandbox",
description:
"an open-source cloud-based code editing environment with custom AI code generation, live preview, real-time collaboration, and AI chat",
},
twitter: {
site: "https://sandbox.gitwit.dev",
title: "Sandbox by Gitwit",
description:
"an open-source cloud-based code editing environment with custom AI code generation, live preview, real-time collaboration, and AI chat",
creator: "@gitwitdev",
},
description: "A collaborative, AI-powered cloud code editing environment",
}
export default function RootLayout({
@ -40,12 +23,11 @@ export default function RootLayout({
<body>
<ThemeProvider
attribute="class"
defaultTheme="system"
defaultTheme="dark"
forcedTheme="dark"
disableTransitionOnChange
>
<SocketProvider>
<PreviewProvider>{children}</PreviewProvider>
</SocketProvider>
{children}
<Analytics />
<Toaster position="bottom-left" richColors />
</ThemeProvider>

View File

@ -1 +0,0 @@
About Sandbox by Gitwit

Binary file not shown (465 KiB before change).

View File

@ -1,13 +1,13 @@
import Landing from "@/components/landing"
import { currentUser } from "@clerk/nextjs"
import { redirect } from "next/navigation"
import { currentUser } from "@clerk/nextjs";
import { redirect } from "next/navigation";
import Landing from "@/components/landing";
export default async function Home() {
const user = await currentUser()
const user = await currentUser();
if (user) {
redirect("/dashboard")
redirect("/dashboard");
}
return <Landing />
return <Landing />;
}

View File

@ -1,13 +0,0 @@
"use client"
import posthog from "posthog-js"
import { PostHogProvider } from "posthog-js/react"
if (typeof window !== "undefined") {
posthog.init(process.env.NEXT_PUBLIC_POSTHOG_KEY, {
api_host: process.env.NEXT_PUBLIC_POSTHOG_HOST,
})
}
export function PHProvider({ children }) {
return <PostHogProvider client={posthog}>{children}</PostHogProvider>
}

File diff suppressed because one or more lines are too long

Image file replaced (29 KiB before, 2.3 KiB after); preview not shown.

View File

@ -14,4 +14,4 @@
"components": "@/components",
"utils": "@/lib/utils"
}
}
}

View File

@ -3,9 +3,16 @@
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
} from "@/components/ui/dialog"
import Image from "next/image"
import { useState } from "react"
import { Button } from "../ui/button"
import { ChevronRight } from "lucide-react"
export default function AboutModal({
open,
@ -18,38 +25,11 @@ export default function AboutModal({
<Dialog open={open} onOpenChange={setOpen}>
<DialogContent>
<DialogHeader>
<DialogTitle>Help & Support</DialogTitle>
<DialogTitle>About this project</DialogTitle>
</DialogHeader>
<div className="space-y-4">
{/* <div className="text-sm text-muted-foreground">
Sandbox is an open-source cloud-based code editing environment with
custom AI code autocompletion and real-time collaboration.
</div> */}
<div className="text-sm text-muted-foreground">
Get help and support through our Discord community or by creating issues on GitHub:
</div>
<div className="space-y-2">
<div className="text-sm">
<a
href="https://discord.gitwit.dev/"
target="_blank"
rel="noopener noreferrer"
className="text-primary hover:underline"
>
Join our Discord community
</a>
</div>
<div className="text-sm">
<a
href="https://github.com/jamesmurdza/sandbox/issues"
target="_blank"
rel="noopener noreferrer"
className="text-primary hover:underline"
>
Report issues on GitHub
</a>
</div>
</div>
<div className="text-sm text-muted-foreground">
Sandbox is an open-source cloud-based code editing environment with
custom AI code autocompletion and real-time collaboration.
</div>
</DialogContent>
</Dialog>

View File

@ -1,16 +1,24 @@
"use client"
import { Button } from "@/components/ui/button"
import CustomButton from "@/components/ui/customButton"
import { Sandbox } from "@/lib/types"
import { Code2, FolderDot, HelpCircle, Plus, Users } from "lucide-react"
import { useRouter, useSearchParams } from "next/navigation"
import { Button } from "@/components/ui/button"
import {
Code2,
FolderDot,
HelpCircle,
Plus,
Settings,
Users,
} from "lucide-react"
import { useEffect, useState } from "react"
import { toast } from "sonner"
import AboutModal from "./about"
import NewProjectModal from "./newProject"
import { Sandbox } from "@/lib/types"
import DashboardProjects from "./projects"
import DashboardSharedWithMe from "./shared"
import NewProjectModal from "./newProject"
import Link from "next/link"
import { useRouter, useSearchParams } from "next/navigation"
import AboutModal from "./about"
import { toast } from "sonner"
type TScreen = "projects" | "shared" | "settings" | "search"
@ -25,7 +33,6 @@ export default function Dashboard({
type: "react" | "node"
author: string
sharedOn: Date
authorAvatarUrl?: string
}[]
}) {
const [screen, setScreen] = useState<TScreen>("projects")
@ -43,9 +50,10 @@ export default function Dashboard({
const router = useRouter()
useEffect(() => {
// update the dashboard to show a new project
router.refresh()
}, [])
if (!sandboxes) {
router.refresh()
}
}, [sandboxes])
return (
<>
@ -78,14 +86,14 @@ export default function Dashboard({
<FolderDot className="w-4 h-4 mr-2" />
My Projects
</Button>
{/* <Button
<Button
variant="ghost"
onClick={() => setScreen("shared")}
className={activeScreen("shared")}
>
<Users className="w-4 h-4 mr-2" />
Shared With Me
</Button> */}
</Button>
{/* <Button
variant="ghost"
onClick={() => setScreen("settings")}
@ -96,7 +104,7 @@ export default function Dashboard({
</Button> */}
</div>
<div className="flex flex-col">
<a target="_blank" href="https://github.com/jamesmurdza/sandbox">
<a target="_blank" href="https://github.com/ishaan1013/sandbox">
<Button
variant="ghost"
className="justify-start w-full font-normal text-muted-foreground"
@ -111,7 +119,7 @@ export default function Dashboard({
className="justify-start font-normal text-muted-foreground"
>
<HelpCircle className="w-4 h-4 mr-2" />
Help
About
</Button>
</div>
</div>
@ -122,12 +130,7 @@ export default function Dashboard({
) : null}
</>
) : screen === "shared" ? (
<DashboardSharedWithMe
shared={shared.map((item) => ({
...item,
authorAvatarUrl: item.authorAvatarUrl || "",
}))}
/>
<DashboardSharedWithMe shared={shared} />
) : screen === "settings" ? null : null}
</div>
</>

View File

@ -1,31 +1,24 @@
import Logo from "@/assets/logo.svg"
import { ThemeSwitcher } from "@/components/ui/theme-switcher"
import { User } from "@/lib/types"
import Image from "next/image"
import Link from "next/link"
import UserButton from "../../ui/userButton"
import Logo from "@/assets/logo.svg"
import DashboardNavbarSearch from "./search"
import UserButton from "../../ui/userButton"
import { User } from "@/lib/types"
export default function DashboardNavbar({ userData }: { userData: User }) {
return (
<div className=" py-2 px-4 w-full flex items-center justify-between border-b border-border">
<div className="flex items-center space-x-2">
<div className="h-16 px-4 w-full flex items-center justify-between border-b border-border">
<div className="flex items-center space-x-4">
<Link
href="/"
className="ring-offset-2 ring-offset-background focus-visible:outline-none focus-visible:ring-1 focus-visible:ring-ring disabled:pointer-events-none rounded-sm"
>
<Image src={Logo} alt="Logo" width={36} height={36} />
</Link>
<h1 className="text-xl">
<span className="font-semibold">Sandbox</span>{" "}
<span className="text-xs font-medium text-muted-foreground">
by gitwit
</span>
</h1>
<div className="text-sm font-medium flex items-center">Sandbox</div>
</div>
<div className="flex items-center space-x-4">
<DashboardNavbarSearch />
<ThemeSwitcher />
<UserButton userData={userData} />
</div>
</div>

View File

@ -1,12 +1,13 @@
"use client"
"use client";
import { Search } from "lucide-react"
import { useRouter } from "next/navigation"
import { Input } from "../../ui/input"
import { Input } from "../../ui/input";
import { Search } from "lucide-react";
import { useEffect, useState } from "react";
import { useRouter } from "next/navigation";
export default function DashboardNavbarSearch() {
// const [search, setSearch] = useState("");
const router = useRouter()
const router = useRouter();
// useEffect(() => {
// const delayDebounceFn = setTimeout(() => {
@ -28,14 +29,14 @@ export default function DashboardNavbarSearch() {
// onChange={(e) => setSearch(e.target.value)}
onChange={(e) => {
if (e.target.value === "") {
router.push(`/dashboard`)
return
router.push(`/dashboard`);
return;
}
router.push(`/dashboard?q=${e.target.value}`)
router.push(`/dashboard?q=${e.target.value}`);
}}
placeholder="Search projects..."
className="pl-8"
/>
</div>
)
);
}

View File

@ -3,14 +3,16 @@
import {
Dialog,
DialogContent,
DialogDescription,
DialogHeader,
DialogTitle,
DialogTrigger,
} from "@/components/ui/dialog"
import { zodResolver } from "@hookform/resolvers/zod"
import Image from "next/image"
import { useCallback, useEffect, useMemo, useState } from "react"
import { useState } from "react"
import { set, z } from "zod"
import { zodResolver } from "@hookform/resolvers/zod"
import { useForm } from "react-hook-form"
import { z } from "zod"
import {
Form,
@ -29,17 +31,51 @@ import {
SelectTrigger,
SelectValue,
} from "@/components/ui/select"
import { createSandbox } from "@/lib/actions"
import { projectTemplates } from "@/lib/data"
import { useUser } from "@clerk/nextjs"
import { ChevronLeft, ChevronRight, Loader2, Search } from "lucide-react"
import { createSandbox } from "@/lib/actions"
import { useRouter } from "next/navigation"
import { Loader2 } from "lucide-react"
import { Button } from "../ui/button"
import { cn } from "@/lib/utils"
import type { EmblaCarouselType } from "embla-carousel"
import useEmblaCarousel from "embla-carousel-react"
import { WheelGesturesPlugin } from "embla-carousel-wheel-gestures"
type TOptions = "react" | "node" | "python" | "more"
const data: {
id: TOptions
name: string
icon: string
description: string
disabled: boolean
}[] = [
{
id: "react",
name: "React",
icon: "/project-icons/react.svg",
description: "A JavaScript library for building user interfaces",
disabled: false,
},
{
id: "node",
name: "Node",
icon: "/project-icons/node.svg",
description: "A JavaScript runtime built on the V8 JavaScript engine",
disabled: false,
},
{
id: "python",
name: "Python",
icon: "/project-icons/python.svg",
description: "A high-level, general-purpose language, coming soon",
disabled: true,
},
{
id: "more",
name: "More Languages",
icon: "/project-icons/more.svg",
description: "More coming soon, feel free to contribute on GitHub",
disabled: true,
},
]
const formSchema = z.object({
name: z
.string()
@ -59,20 +95,11 @@ export default function NewProjectModal({
open: boolean
setOpen: (open: boolean) => void
}) {
const router = useRouter()
const user = useUser()
const [selected, setSelected] = useState("reactjs")
const [selected, setSelected] = useState<TOptions>("react")
const [loading, setLoading] = useState(false)
const [emblaRef, emblaApi] = useEmblaCarousel({ loop: false }, [
WheelGesturesPlugin(),
])
const {
prevBtnDisabled,
nextBtnDisabled,
onPrevButtonClick,
onNextButtonClick,
} = usePrevNextButtons(emblaApi)
const [search, setSearch] = useState("")
const router = useRouter()
const user = useUser()
const form = useForm<z.infer<typeof formSchema>>({
resolver: zodResolver(formSchema),
@ -82,26 +109,6 @@ export default function NewProjectModal({
},
})
const handleTemplateClick = useCallback(
({ id, index }: { id: string; index: number }) => {
setSelected(id)
emblaApi?.scrollTo(index)
},
[emblaApi]
)
const filteredTemplates = useMemo(
() =>
projectTemplates.filter(
(item) =>
item.name.toLowerCase().includes(search.toLowerCase()) ||
item.description.toLowerCase().includes(search.toLowerCase())
),
[search, projectTemplates]
)
const emptyTemplates = useMemo(
() => filteredTemplates.length === 0,
[filteredTemplates]
)
async function onSubmit(values: z.infer<typeof formSchema>) {
if (!user.isSignedIn) return
@ -111,6 +118,7 @@ export default function NewProjectModal({
const id = await createSandbox(sandboxData)
router.push(`/code/${id}`)
}
return (
<Dialog
open={open}
@ -118,93 +126,29 @@ export default function NewProjectModal({
if (!loading) setOpen(open)
}}
>
<DialogContent className="max-h-[95vh] overflow-y-auto">
<DialogContent>
<DialogHeader>
<DialogTitle>Create A Sandbox</DialogTitle>
</DialogHeader>
<div className="flex flex-col gap-2 max-w-full overflow-hidden">
<div className="flex items-center justify-end">
<SearchInput
{...{
value: search,
onValueChange: setSearch,
}}
/>
</div>
<div className="overflow-hidden relative" ref={emblaRef}>
<div
className={cn(
"grid grid-flow-col gap-x-2 min-h-[97px]",
emptyTemplates ? "auto-cols-[100%]" : "auto-cols-[200px]"
)}
<div className="grid grid-cols-2 w-full gap-2 mt-2">
{data.map((item) => (
<button
disabled={item.disabled || loading}
key={item.id}
onClick={() => setSelected(item.id)}
className={`${
selected === item.id ? "border-foreground" : "border-border"
} rounded-md border bg-card text-card-foreground shadow text-left p-4 flex flex-col transition-all focus-visible:outline-none focus-visible:ring-offset-2 focus-visible:ring-offset-background focus-visible:ring-2 focus-visible:ring-ring disabled:opacity-50 disabled:cursor-not-allowed`}
>
{filteredTemplates.map((item, i) => (
<button
disabled={item.disabled || loading}
key={item.id}
onClick={handleTemplateClick.bind(null, {
id: item.id,
index: i,
})}
className={cn(
selected === item.id
? "shadow-foreground"
: "shadow-border",
"shadow-[0_0_0_1px_inset] rounded-md border bg-card text-card-foreground text-left p-4 flex flex-col transition-all focus-visible:outline-none focus-visible:ring-offset-2 focus-visible:ring-offset-background focus-visible:ring-2 focus-visible:ring-ring disabled:opacity-50 disabled:cursor-not-allowed"
)}
>
<div className="space-x-2 flex items-center justify-start w-full">
<Image alt="" src={item.icon} width={20} height={20} />
<div className="font-medium">{item.name}</div>
</div>
<div className="mt-2 text-muted-foreground text-xs line-clamp-2">
{item.description}
</div>
</button>
))}
{emptyTemplates && (
<div className="flex flex-col gap-2 items-center text-center justify-center text-muted-foreground text-sm">
<p>No templates found</p>
<Button size="xs" asChild>
<a
href="https://github.com/jamesmurdza/sandbox"
target="_blank"
>
Contribute
</a>
</Button>
</div>
)}
</div>
<div
className={cn(
"absolute transition-all opacity-100 duration-400 bg-gradient-to-r from-background via-background to-transparent w-14 pl-1 left-0 top-0 -translate-x-1 bottom-0 h-full flex items-center",
prevBtnDisabled && "opacity-0 pointer-events-none"
)}
>
<Button
size="smIcon"
className="rounded-full"
onClick={onPrevButtonClick}
>
<ChevronLeft className="size-5" />
</Button>
</div>
<div
className={cn(
"absolute transition-all opacity-100 duration-400 bg-gradient-to-l from-background via-background to-transparent w-14 pl-1 right-0 top-0 translate-x-1 bottom-0 h-full flex items-center",
nextBtnDisabled && "opacity-0 pointer-events-none"
)}
>
<Button
size="smIcon"
className="rounded-full"
onClick={onNextButtonClick}
>
<ChevronRight className="size-5" />
</Button>
</div>
</div>
<div className="space-x-2 flex items-center justify-start w-full">
<Image alt="" src={item.icon} width={20} height={20} />
<div className="font-medium">{item.name}</div>
</div>
<div className="mt-2 text-muted-foreground text-sm">
{item.description}
</div>
</button>
))}
</div>
<Form {...form}>
@ -272,68 +216,3 @@ export default function NewProjectModal({
</Dialog>
)
}
function SearchInput({
value,
onValueChange,
}: {
value?: string
onValueChange?: (value: string) => void
}) {
const onSubmit = useCallback((e: React.FormEvent) => {
e.preventDefault()
console.log("searching")
}, [])
return (
<form {...{ onSubmit }} className="w-40 h-8 ">
<label
htmlFor="template-search"
className="flex gap-2 rounded-sm transition-colors bg-gray-100 dark:bg-[#2e2e2e] border border-[--s-color] [--s-color:hsl(var(--muted-foreground))] focus-within:[--s-color:hsl(var(--muted-foreground),50%)] h-full items-center px-2"
>
<Search className="size-4 text-[--s-color] transition-colors" />
<input
id="template-search"
type="text"
name="search"
placeholder="Search templates"
value={value}
onChange={(e) => onValueChange?.(e.target.value)}
className="bg-transparent placeholder:text-muted-foreground w-full focus:outline-none text-xs"
/>
</label>
</form>
)
}
const usePrevNextButtons = (emblaApi: EmblaCarouselType | undefined) => {
const [prevBtnDisabled, setPrevBtnDisabled] = useState(true)
const [nextBtnDisabled, setNextBtnDisabled] = useState(true)
const onPrevButtonClick = useCallback(() => {
if (!emblaApi) return
emblaApi.scrollPrev()
}, [emblaApi])
const onNextButtonClick = useCallback(() => {
if (!emblaApi) return
emblaApi.scrollNext()
}, [emblaApi])
const onSelect = useCallback((emblaApi: EmblaCarouselType) => {
setPrevBtnDisabled(!emblaApi.canScrollPrev())
setNextBtnDisabled(!emblaApi.canScrollNext())
}, [])
useEffect(() => {
if (!emblaApi) return
onSelect(emblaApi)
emblaApi.on("reInit", onSelect).on("select", onSelect)
}, [emblaApi, onSelect])
return {
prevBtnDisabled,
nextBtnDisabled,
onPrevButtonClick,
onNextButtonClick,
}
}

View File

@ -1,44 +1,44 @@
"use client"
"use client";
import { Sandbox } from "@/lib/types"
import { Ellipsis, Globe, Lock, Trash2 } from "lucide-react"
import { Sandbox } from "@/lib/types";
import { Ellipsis, Globe, Lock, Trash2 } from "lucide-react";
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu"
} from "@/components/ui/dropdown-menu";
export default function ProjectCardDropdown({
visibility,
sandbox,
onVisibilityChange,
onDelete,
}: {
visibility: Sandbox["visibility"]
onVisibilityChange: () => void
onDelete: () => void
sandbox: Sandbox;
onVisibilityChange: (sandbox: Sandbox) => void;
onDelete: (sandbox: Sandbox) => void;
}) {
return (
<DropdownMenu modal={false}>
<DropdownMenuTrigger
onClick={(e) => {
e.preventDefault()
e.stopPropagation()
e.preventDefault();
e.stopPropagation();
}}
className="h-6 w-6 z-10 flex items-center justify-center transition-colors bg-transparent hover:bg-muted-foreground/25 rounded-sm outline-foreground"
className="h-6 w-6 flex items-center justify-center transition-colors bg-transparent hover:bg-muted-foreground/25 rounded-sm outline-foreground"
>
<Ellipsis className="w-4 h-4" />
</DropdownMenuTrigger>
<DropdownMenuContent className="w-40">
<DropdownMenuItem
onClick={(e) => {
e.stopPropagation()
onVisibilityChange()
e.stopPropagation();
onVisibilityChange(sandbox);
}}
className="cursor-pointer"
>
{visibility === "public" ? (
{sandbox.visibility === "public" ? (
<>
<Lock className="mr-2 h-4 w-4" />
<span>Make Private</span>
@ -52,8 +52,8 @@ export default function ProjectCardDropdown({
</DropdownMenuItem>
<DropdownMenuItem
onClick={(e) => {
e.stopPropagation()
onDelete()
e.stopPropagation();
onDelete(sandbox);
}}
className="!text-destructive cursor-pointer"
>
@ -62,5 +62,5 @@ export default function ProjectCardDropdown({
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
)
);
}

View File

@ -1,235 +1,56 @@
"use client"
import { Button } from "@/components/ui/button"
import { Card } from "@/components/ui/card"
import { toggleLike } from "@/lib/actions"
import { projectTemplates } from "@/lib/data"
import { Sandbox } from "@/lib/types"
import { cn } from "@/lib/utils"
import { useUser } from "@clerk/nextjs"
import { AnimatePresence, motion } from "framer-motion"
import { Clock, Eye, Globe, Heart, Lock } from "lucide-react"
import Image from "next/image"
import Link from "next/link"
import { useRouter } from "next/navigation"
import {
memo,
MouseEventHandler,
useEffect,
useMemo,
useOptimistic,
useState,
useTransition,
} from "react"
import { useEffect, useState } from "react"
import ProjectCardDropdown from "./dropdown"
import { CanvasRevealEffect } from "./revealEffect"
import { Clock, Globe, Lock } from "lucide-react"
import { Sandbox } from "@/lib/types"
import { Card } from "@/components/ui/card"
import { useRouter } from "next/navigation"
type BaseProjectCardProps = {
id: string
name: string
type: string
visibility: "public" | "private"
createdAt: Date
likeCount: number
liked?: boolean
viewCount: number
}
type AuthenticatedProjectCardProps = BaseProjectCardProps & {
isAuthenticated: true
onVisibilityChange: (
sandbox: Pick<Sandbox, "id" | "name" | "visibility">
) => void
onDelete: (sandbox: Pick<Sandbox, "id" | "name">) => void
export default function ProjectCard({
children,
sandbox,
onVisibilityChange,
onDelete,
deletingId,
}: {
children?: React.ReactNode
sandbox: Sandbox
onVisibilityChange: (sandbox: Sandbox) => void
onDelete: (sandbox: Sandbox) => void
deletingId: string
}
type UnauthenticatedProjectCardProps = BaseProjectCardProps & {
isAuthenticated: false
}
type ProjectCardProps =
| AuthenticatedProjectCardProps
| UnauthenticatedProjectCardProps
const StatItem = memo(({ icon: Icon, value }: { icon: any; value: number }) => (
<div className="flex items-center space-x-1">
<Icon className="size-4" />
<span className="text-xs">{value}</span>
</div>
))
StatItem.displayName = "StatItem"
const formatDate = (date: Date): string => {
const now = new Date()
const diffInMinutes = Math.floor((now.getTime() - date.getTime()) / 60000)
if (diffInMinutes < 1) return "Now"
if (diffInMinutes < 60) return `${diffInMinutes}m ago`
if (diffInMinutes < 1440) return `${Math.floor(diffInMinutes / 60)}h ago`
return `${Math.floor(diffInMinutes / 1440)}d ago`
}
const ProjectMetadata = memo(
({
id,
visibility,
createdAt,
likeCount,
liked,
viewCount,
}: Pick<
BaseProjectCardProps,
"visibility" | "createdAt" | "likeCount" | "liked" | "viewCount" | "id"
>) => {
const { user } = useUser()
const [date, setDate] = useState<string>()
const Icon = visibility === "private" ? Lock : Globe
useEffect(() => {
setDate(formatDate(new Date(createdAt)))
}, [createdAt])
return (
<div className="flex flex-col text-muted-foreground space-y-2 text-sm z-10">
<div className="flex items-center justify-between">
<div className="flex items-center gap-2">
<Icon className="size-4" />
<span className="text-xs">
{visibility === "private" ? "Private" : "Public"}
</span>
</div>
</div>
<div className="flex gap-3">
<div className="flex items-center gap-2">
<Clock className="size-4" /> <span className="text-xs">{date}</span>
</div>
<LikeButton
sandboxId={id}
initialIsLiked={!!liked}
initialLikeCount={likeCount}
userId={user?.id ?? null}
/>
<StatItem icon={Eye} value={viewCount} />
</div>
</div>
)
}
)
ProjectMetadata.displayName = "ProjectMetadata"
interface LikeButtonProps {
sandboxId: string
userId: string | null
initialLikeCount: number
initialIsLiked: boolean
}
export function LikeButton({
sandboxId,
userId,
initialLikeCount,
initialIsLiked,
}: LikeButtonProps) {
// Optimistic state for like status and count
const [{ isLiked, likeCount }, optimisticUpdateLike] = useOptimistic(
{ isLiked: initialIsLiked, likeCount: initialLikeCount },
(state, optimisticValue: boolean) => {
return {
isLiked: optimisticValue,
likeCount: state.likeCount + (optimisticValue ? 1 : -1),
}
}
)
const [isPending, startTransition] = useTransition()
const handleLike: MouseEventHandler<HTMLButtonElement> = async (e) => {
e.stopPropagation() // Prevent click event from bubbling up which leads to navigation to /code/:id
if (!userId) return
startTransition(async () => {
const newLikeState = !isLiked
try {
optimisticUpdateLike(newLikeState)
await toggleLike(sandboxId, userId)
} catch (error) {
console.log("error", error)
optimisticUpdateLike(!newLikeState)
}
})
}
return (
<Button
variant="ghost"
size="sm"
disabled={!userId || isPending}
onClick={handleLike}
className="gap-1 px-1 rounded-full"
>
<Heart
className={cn("size-4", isLiked ? "stroke-red-500 fill-red-500" : "")}
/>
<span className="text-xs">{likeCount}</span>
</Button>
)
}
function ProjectCardComponent({
id,
name,
type,
visibility,
createdAt,
likeCount,
viewCount,
...props
}: ProjectCardProps) {
}) {
const [hovered, setHovered] = useState(false)
const [date, setDate] = useState<string>()
const router = useRouter()
const projectIcon = useMemo(
() =>
projectTemplates.find((p) => p.id === type)?.icon ??
"/project-icons/node.svg",
[type]
)
useEffect(() => {
const createdAt = new Date(sandbox.createdAt)
const now = new Date()
const diffInMinutes = Math.floor(
(now.getTime() - createdAt.getTime()) / 60000
)
const handleVisibilityChange = () => {
if (props.isAuthenticated) {
props.onVisibilityChange({
id,
name,
visibility,
})
if (diffInMinutes < 1) {
setDate("Now")
} else if (diffInMinutes < 60) {
setDate(`${diffInMinutes}m ago`)
} else if (diffInMinutes < 1440) {
setDate(`${Math.floor(diffInMinutes / 60)}h ago`)
} else {
setDate(`${Math.floor(diffInMinutes / 1440)}d ago`)
}
}
const handleDelete = () => {
if (props.isAuthenticated) {
props.onDelete({
id,
name,
})
}
}
}, [sandbox])
return (
<Card
tabIndex={0}
onClick={() => router.push(`/code/${id}`)}
onClick={() => router.push(`/code/${sandbox.id}`)}
onMouseEnter={() => setHovered(true)}
onMouseLeave={() => setHovered(false)}
className={`
group/canvas-card p-4 h-48 flex flex-col justify-between items-start
hover:border-muted-foreground/50 relative overflow-hidden transition-all
${
props.isAuthenticated && props.deletingId === id
? "opacity-50 pointer-events-none cursor-events-none"
: "cursor-pointer"
}
`}
className={`group/canvas-card p-4 h-48 flex flex-col justify-between items-start hover:border-muted-foreground/50 relative overflow-hidden transition-all`}
>
<AnimatePresence>
{hovered && (
@ -238,64 +59,47 @@ function ProjectCardComponent({
animate={{ opacity: 1 }}
className="h-full w-full absolute inset-0"
>
<CanvasRevealEffect
animationSpeed={3}
containerClassName="bg-muted"
colors={colors[type]}
dotSize={2}
/>
<div className="absolute inset-0 [mask-image:radial-gradient(400px_at_center,white,transparent)] bg-background/75" />
{children}
</motion.div>
)}
</AnimatePresence>
<div className="space-x-2 flex items-center justify-start w-full z-10">
<Image
alt={`${type} project icon`}
src={projectIcon}
alt=""
src={
sandbox.type === "react"
? "/project-icons/react.svg"
: "/project-icons/node.svg"
}
width={20}
height={20}
/>
<Link
href={`/code/${id}`}
className="font-medium static whitespace-nowrap w-full text-ellipsis overflow-hidden before:content-[''] before:absolute before:z-0 before:top-0 before:left-0 before:w-full before:h-full before:rounded-xl"
>
{name}
</Link>
{props.isAuthenticated && (
<ProjectCardDropdown
onVisibilityChange={handleVisibilityChange}
onDelete={handleDelete}
visibility={visibility}
/>
)}
<div className="font-medium static whitespace-nowrap w-full text-ellipsis overflow-hidden">
{sandbox.name}
</div>
<ProjectCardDropdown
sandbox={sandbox}
onVisibilityChange={onVisibilityChange}
onDelete={onDelete}
/>
</div>
<div className="flex flex-col text-muted-foreground space-y-0.5 text-sm z-10">
<div className="flex items-center">
{sandbox.visibility === "private" ? (
<>
<Lock className="w-3 h-3 mr-2" /> Private
</>
) : (
<>
<Globe className="w-3 h-3 mr-2" /> Public
</>
)}
</div>
<div className="flex items-center">
<Clock className="w-3 h-3 mr-2" /> {date}
</div>
</div>
<ProjectMetadata
visibility={visibility}
createdAt={createdAt}
likeCount={likeCount}
viewCount={viewCount}
id={id}
liked={props.liked}
/>
</Card>
)
}
ProjectCardComponent.displayName = "ProjectCard"
const ProjectCard = memo(ProjectCardComponent)
export default ProjectCard
const colors: { [key: string]: number[][] } = {
react: [
[71, 207, 237],
[30, 126, 148],
],
node: [
[86, 184, 72],
[59, 112, 52],
],
}

View File

@ -1,8 +1,8 @@
"use client"
import { cn } from "@/lib/utils"
import { Canvas, useFrame, useThree } from "@react-three/fiber"
import React, { useMemo, useRef } from "react"
import * as THREE from "three"
"use client";
import { cn } from "@/lib/utils";
import { Canvas, useFrame, useThree } from "@react-three/fiber";
import React, { useMemo, useRef } from "react";
import * as THREE from "three";
export const CanvasRevealEffect = ({
animationSpeed = 0.4,
@ -12,12 +12,12 @@ export const CanvasRevealEffect = ({
dotSize,
showGradient = true,
}: {
animationSpeed?: number
opacities?: number[]
colors?: number[][]
containerClassName?: string
dotSize?: number
showGradient?: boolean
animationSpeed?: number;
opacities?: number[];
colors?: number[][];
containerClassName?: string;
dotSize?: number;
showGradient?: boolean;
}) => {
return (
<div className={cn("h-full relative bg-white w-full", containerClassName)}>
@ -41,16 +41,16 @@ export const CanvasRevealEffect = ({
<div className="absolute inset-0 bg-gradient-to-t from-background to-[100%]" />
)}
</div>
)
}
);
};
interface DotMatrixProps {
colors?: number[][]
opacities?: number[]
totalSize?: number
dotSize?: number
shader?: string
center?: ("x" | "y")[]
colors?: number[][];
opacities?: number[];
totalSize?: number;
dotSize?: number;
shader?: string;
center?: ("x" | "y")[];
}
const DotMatrix: React.FC<DotMatrixProps> = ({
@ -69,7 +69,7 @@ const DotMatrix: React.FC<DotMatrixProps> = ({
colors[0],
colors[0],
colors[0],
]
];
if (colors.length === 2) {
colorsArray = [
colors[0],
@ -78,7 +78,7 @@ const DotMatrix: React.FC<DotMatrixProps> = ({
colors[1],
colors[1],
colors[1],
]
];
} else if (colors.length === 3) {
colorsArray = [
colors[0],
@ -87,7 +87,7 @@ const DotMatrix: React.FC<DotMatrixProps> = ({
colors[1],
colors[2],
colors[2],
]
];
}
return {
@ -111,8 +111,8 @@ const DotMatrix: React.FC<DotMatrixProps> = ({
value: dotSize,
type: "uniform1f",
},
}
}, [colors, opacities, totalSize, dotSize])
};
}, [colors, opacities, totalSize, dotSize]);
return (
<Shader
@ -168,87 +168,87 @@ const DotMatrix: React.FC<DotMatrixProps> = ({
uniforms={uniforms}
maxFps={60}
/>
)
}
);
};
type Uniforms = {
[key: string]: {
value: number[] | number[][] | number
type: string
}
}
value: number[] | number[][] | number;
type: string;
};
};
const ShaderMaterial = ({
source,
uniforms,
maxFps = 60,
}: {
source: string
hovered?: boolean
maxFps?: number
uniforms: Uniforms
source: string;
hovered?: boolean;
maxFps?: number;
uniforms: Uniforms;
}) => {
const { size } = useThree()
const ref = useRef<THREE.Mesh>()
let lastFrameTime = 0
const { size } = useThree();
const ref = useRef<THREE.Mesh>();
let lastFrameTime = 0;
useFrame(({ clock }) => {
if (!ref.current) return
const timestamp = clock.getElapsedTime()
if (!ref.current) return;
const timestamp = clock.getElapsedTime();
if (timestamp - lastFrameTime < 1 / maxFps) {
return
return;
}
lastFrameTime = timestamp
lastFrameTime = timestamp;
const material: any = ref.current.material
const timeLocation = material.uniforms.u_time
timeLocation.value = timestamp
})
const material: any = ref.current.material;
const timeLocation = material.uniforms.u_time;
timeLocation.value = timestamp;
});
const getUniforms = () => {
const preparedUniforms: any = {}
const preparedUniforms: any = {};
for (const uniformName in uniforms) {
const uniform: any = uniforms[uniformName]
const uniform: any = uniforms[uniformName];
switch (uniform.type) {
case "uniform1f":
preparedUniforms[uniformName] = { value: uniform.value, type: "1f" }
break
preparedUniforms[uniformName] = { value: uniform.value, type: "1f" };
break;
case "uniform3f":
preparedUniforms[uniformName] = {
value: new THREE.Vector3().fromArray(uniform.value),
type: "3f",
}
break
};
break;
case "uniform1fv":
preparedUniforms[uniformName] = { value: uniform.value, type: "1fv" }
break
preparedUniforms[uniformName] = { value: uniform.value, type: "1fv" };
break;
case "uniform3fv":
preparedUniforms[uniformName] = {
value: uniform.value.map((v: number[]) =>
new THREE.Vector3().fromArray(v)
),
type: "3fv",
}
break
};
break;
case "uniform2f":
preparedUniforms[uniformName] = {
value: new THREE.Vector2().fromArray(uniform.value),
type: "2f",
}
break
};
break;
default:
console.error(`Invalid uniform type for '${uniformName}'.`)
break
console.error(`Invalid uniform type for '${uniformName}'.`);
break;
}
}
preparedUniforms["u_time"] = { value: 0, type: "1f" }
preparedUniforms["u_time"] = { value: 0, type: "1f" };
preparedUniforms["u_resolution"] = {
value: new THREE.Vector2(size.width * 2, size.height * 2),
} // Initialize u_resolution
return preparedUniforms
}
}; // Initialize u_resolution
return preparedUniforms;
};
// Shader material
const material = useMemo(() => {
@ -272,33 +272,33 @@ const ShaderMaterial = ({
blending: THREE.CustomBlending,
blendSrc: THREE.SrcAlphaFactor,
blendDst: THREE.OneFactor,
})
});
return materialObject
}, [size.width, size.height, source])
return materialObject;
}, [size.width, size.height, source]);
return (
<mesh ref={ref as any}>
<planeGeometry args={[2, 2]} />
<primitive object={material} attach="material" />
</mesh>
)
}
);
};
const Shader: React.FC<ShaderProps> = ({ source, uniforms, maxFps = 60 }) => {
return (
<Canvas className="absolute inset-0 h-full w-full">
<ShaderMaterial source={source} uniforms={uniforms} maxFps={maxFps} />
</Canvas>
)
}
);
};
interface ShaderProps {
source: string
source: string;
uniforms: {
[key: string]: {
value: number[] | number[][] | number
type: string
}
}
maxFps?: number
value: number[] | number[][] | number;
type: string;
};
};
maxFps?: number;
}

View File

@ -1,12 +1,18 @@
"use client"
"use client";
import { deleteSandbox, updateSandbox } from "@/lib/actions"
import { Sandbox } from "@/lib/types"
import { useEffect, useMemo, useState } from "react"
import { toast } from "sonner"
import ProjectCard from "./projectCard"
import { Sandbox } from "@/lib/types";
import ProjectCard from "./projectCard";
import Image from "next/image";
import ProjectCardDropdown from "./projectCard/dropdown";
import { Clock, Globe, Lock } from "lucide-react";
import Link from "next/link";
import { Card } from "../ui/card";
import { deleteSandbox, updateSandbox } from "@/lib/actions";
import { toast } from "sonner";
import { useEffect, useState } from "react";
import { CanvasRevealEffect } from "./projectCard/revealEffect";
const colors: { [key: string]: number[][] } = {
const colors = {
react: [
[71, 207, 237],
[30, 126, 148],
@ -15,44 +21,38 @@ const colors: { [key: string]: number[][] } = {
[86, 184, 72],
[59, 112, 52],
],
}
};
export default function DashboardProjects({
sandboxes,
q,
}: {
sandboxes: Sandbox[]
q: string | null
sandboxes: Sandbox[];
q: string | null;
}) {
const [deletingId, setDeletingId] = useState<string>("")
const [deletingId, setDeletingId] = useState<string>("");
const onVisibilityChange = useMemo(
() => async (sandbox: Pick<Sandbox, "id" | "name" | "visibility">) => {
const newVisibility =
sandbox.visibility === "public" ? "private" : "public"
toast(`Project ${sandbox.name} is now ${newVisibility}.`)
await updateSandbox({
id: sandbox.id,
visibility: newVisibility,
})
},
[]
)
const onDelete = useMemo(
() => async (sandbox: Pick<Sandbox, "id" | "name">) => {
setDeletingId(sandbox.id)
toast(`Project ${sandbox.name} deleted.`)
await deleteSandbox(sandbox.id)
},
[]
)
const onDelete = async (sandbox: Sandbox) => {
setDeletingId(sandbox.id);
toast(`Project ${sandbox.name} deleted.`);
await deleteSandbox(sandbox.id);
};
useEffect(() => {
if (deletingId) {
setDeletingId("")
setDeletingId("");
}
}, [sandboxes])
}, [sandboxes]);
const onVisibilityChange = async (sandbox: Sandbox) => {
const newVisibility =
sandbox.visibility === "public" ? "private" : "public";
toast(`Project ${sandbox.name} is now ${newVisibility}.`);
await updateSandbox({
id: sandbox.id,
visibility: newVisibility,
});
};
return (
<div className="grow p-4 flex flex-col">
@ -65,19 +65,35 @@ export default function DashboardProjects({
{sandboxes.map((sandbox) => {
if (q && q.length > 0) {
if (!sandbox.name.toLowerCase().includes(q.toLowerCase())) {
return null
return null;
}
}
return (
<ProjectCard
<Link
key={sandbox.id}
onVisibilityChange={onVisibilityChange}
onDelete={onDelete}
deletingId={deletingId}
isAuthenticated
{...sandbox}
/>
)
href={`/code/${sandbox.id}`}
className={`${
deletingId === sandbox.id
? "pointer-events-none opacity-50 cursor-events-none"
: "cursor-pointer"
} transition-all focus-visible:outline-none focus-visible:ring-offset-2 focus-visible:ring-offset-background focus-visible:ring-2 focus-visible:ring-ring rounded-lg`}
>
<ProjectCard
sandbox={sandbox}
onVisibilityChange={onVisibilityChange}
onDelete={onDelete}
deletingId={deletingId}
>
<CanvasRevealEffect
animationSpeed={3}
containerClassName="bg-black"
colors={colors[sandbox.type]}
dotSize={2}
/>
<div className="absolute inset-0 [mask-image:radial-gradient(400px_at_center,white,transparent)] bg-background/75" />
</ProjectCard>
</Link>
);
})}
</div>
) : (
@ -87,5 +103,5 @@ export default function DashboardProjects({
)}
</div>
</div>
)
);
}

View File

@ -1,29 +1,29 @@
import { Sandbox } from "@/lib/types";
import {
Table,
TableBody,
TableCaption,
TableCell,
TableHead,
TableHeader,
TableRow,
} from "@/components/ui/table"
import { projectTemplates } from "@/lib/data"
import { ChevronRight } from "lucide-react"
import Image from "next/image"
import Link from "next/link"
import Avatar from "../ui/avatar"
import Button from "../ui/customButton"
} from "@/components/ui/table";
import Image from "next/image";
import Button from "../ui/customButton";
import { ChevronRight } from "lucide-react";
import Avatar from "../ui/avatar";
import Link from "next/link";
export default function DashboardSharedWithMe({
shared,
}: {
shared: {
id: string
name: string
type: string
author: string
authorAvatarUrl: string
sharedOn: Date
}[]
id: string;
name: string;
type: "react" | "node";
author: string;
sharedOn: Date;
}[];
}) {
return (
<div className="grow p-4 flex flex-col">
@ -47,8 +47,9 @@ export default function DashboardSharedWithMe({
<Image
alt=""
src={
projectTemplates.find((p) => p.id === sandbox.type)
?.icon ?? "/project-icons/node.svg"
sandbox.type === "react"
? "/project-icons/react.svg"
: "/project-icons/node.svg"
}
width={20}
height={20}
@ -59,11 +60,7 @@ export default function DashboardSharedWithMe({
</TableCell>
<TableCell>
<div className="flex items-center">
<Avatar
name={sandbox.author}
avatarUrl={sandbox.authorAvatarUrl}
className="mr-2"
/>
<Avatar name={sandbox.author} className="mr-2" />
{sandbox.author}
</div>
</TableCell>
@ -89,5 +86,5 @@ export default function DashboardSharedWithMe({
</div>
)}
</div>
)
);
}

View File

@ -1,77 +0,0 @@
import { Check, Loader2 } from "lucide-react"
import { useState } from "react"
import { toast } from "sonner"
import { Button } from "../../ui/button"
interface ApplyButtonProps {
code: string
activeFileName: string
activeFileContent: string
editorRef: { current: any }
onApply: (mergedCode: string, originalCode: string) => void
}
export default function ApplyButton({
code,
activeFileName,
activeFileContent,
editorRef,
onApply,
}: ApplyButtonProps) {
const [isApplying, setIsApplying] = useState(false)
const handleApply = async () => {
setIsApplying(true)
try {
const response = await fetch("/api/merge", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
originalCode: activeFileContent,
newCode: String(code),
fileName: activeFileName,
}),
})
if (!response.ok) {
throw new Error(await response.text())
}
const reader = response.body?.getReader()
const decoder = new TextDecoder()
let mergedCode = ""
if (reader) {
while (true) {
const { done, value } = await reader.read()
if (done) break
mergedCode += decoder.decode(value, { stream: true })
}
}
onApply(mergedCode.trim(), activeFileContent)
} catch (error) {
console.error("Error applying code:", error)
toast.error(
error instanceof Error ? error.message : "Failed to apply code changes"
)
} finally {
setIsApplying(false)
}
}
return (
<Button
onClick={handleApply}
size="sm"
variant="ghost"
className="p-1 h-6"
disabled={isApplying}
>
{isApplying ? (
<Loader2 className="w-4 h-4 animate-spin" />
) : (
<Check className="w-4 h-4" />
)}
</Button>
)
}

View File

@ -1,245 +0,0 @@
import { TFile, TFolder } from "@/lib/types"
import { Image as ImageIcon, Paperclip, Send, StopCircle } from "lucide-react"
import { useEffect } from "react"
import { Button } from "../../ui/button"
import { looksLikeCode } from "./lib/chatUtils"
import { ALLOWED_FILE_TYPES, ChatInputProps } from "./types"
export default function ChatInput({
input,
setInput,
isGenerating,
handleSend,
handleStopGeneration,
onImageUpload,
addContextTab,
activeFileName,
editorRef,
lastCopiedRangeRef,
contextTabs,
onRemoveTab,
textareaRef,
}: ChatInputProps) {
// Auto-resize textarea as content changes
useEffect(() => {
if (textareaRef.current) {
textareaRef.current.style.height = "auto"
textareaRef.current.style.height = textareaRef.current.scrollHeight + "px"
}
}, [input])
// Handle keyboard events for sending messages
const handleKeyDown = (e: React.KeyboardEvent) => {
if (e.key === "Enter") {
if (e.ctrlKey) {
e.preventDefault()
handleSend(true) // Send with full context
} else if (!e.shiftKey && !isGenerating) {
e.preventDefault()
handleSend(false)
}
} else if (
e.key === "Backspace" &&
input === "" &&
contextTabs.length > 0
) {
e.preventDefault()
// Remove the last context tab
const lastTab = contextTabs[contextTabs.length - 1]
onRemoveTab(lastTab.id)
}
}
// Handle paste events for image and code
const handlePaste = async (e: React.ClipboardEvent) => {
// Handle image paste
const items = Array.from(e.clipboardData.items)
for (const item of items) {
if (item.type.startsWith("image/")) {
e.preventDefault()
const file = item.getAsFile()
if (!file) continue
try {
// Convert image to base64 string for context tab title and timestamp
const reader = new FileReader()
reader.onload = () => {
const base64String = reader.result as string
addContextTab(
"image",
`Image ${new Date()
.toLocaleTimeString("en-US", {
hour12: true,
hour: "2-digit",
minute: "2-digit",
})
.replace(/(\d{2}):(\d{2})/, "$1:$2")}`,
base64String
)
}
reader.readAsDataURL(file)
} catch (error) {
console.error("Error processing pasted image:", error)
}
return
}
}
// Get text from clipboard
const text = e.clipboardData.getData("text")
// If text doesn't contain newlines or doesn't look like code, let it paste normally
if (!text || !text.includes("\n") || !looksLikeCode(text)) {
return
}
e.preventDefault()
const editor = editorRef.current
const currentSelection = editor?.getSelection()
const lines = text.split("\n")
// TODO: FIX THIS: even when i paste the outside code, it shows the active file name,it works when no tabs are open, just does not work when the tab is open
// If selection exists in editor, use file name and line numbers
if (currentSelection && !currentSelection.isEmpty()) {
addContextTab(
"code",
`${activeFileName} (${currentSelection.startLineNumber}-${currentSelection.endLineNumber})`,
text,
{
start: currentSelection.startLineNumber,
end: currentSelection.endLineNumber,
}
)
return
}
// If we have stored line range from a copy operation in the editor
if (lastCopiedRangeRef.current) {
const range = lastCopiedRangeRef.current
addContextTab(
"code",
`${activeFileName} (${range.startLine}-${range.endLine})`,
text,
{ start: range.startLine, end: range.endLine }
)
return
}
// For code pasted from outside the editor
addContextTab("code", `Pasted Code (1-${lines.length})`, text, {
start: 1,
end: lines.length,
})
}
// Handle image upload from local machine via input
const handleImageUpload = () => {
const input = document.createElement("input")
input.type = "file"
input.accept = "image/*"
input.onchange = (e) => {
const file = (e.target as HTMLInputElement).files?.[0]
if (file) onImageUpload(file)
}
input.click()
}
// Helper function to flatten the file tree
const getAllFiles = (items: (TFile | TFolder)[]): TFile[] => {
return items.reduce((acc: TFile[], item) => {
if (item.type === "file") {
acc.push(item)
} else {
acc.push(...getAllFiles(item.children))
}
return acc
}, [])
}
// Handle file upload from local machine via input
const handleFileUpload = () => {
const input = document.createElement("input")
input.type = "file"
input.accept = ".txt,.md,.csv,.json,.js,.ts,.html,.css,.pdf"
input.onchange = (e) => {
const file = (e.target as HTMLInputElement).files?.[0]
if (file) {
if (!(file.type in ALLOWED_FILE_TYPES)) {
alert(
"Unsupported file type. Please upload text, code, or PDF files."
)
return
}
const reader = new FileReader()
reader.onload = () => {
addContextTab("file", file.name, reader.result as string)
}
reader.readAsText(file)
}
}
input.click()
}
return (
<div className="space-y-2">
<div className="flex space-x-2 min-w-0">
<textarea
ref={textareaRef}
value={input}
onChange={(e) => setInput(e.target.value)}
onKeyDown={handleKeyDown}
onPaste={handlePaste}
className="flex-grow p-2 border rounded-lg min-w-0 bg-input resize-none overflow-hidden"
placeholder="Type your message..."
disabled={isGenerating}
rows={1}
/>
{/* Render stop generation button */}
{isGenerating ? (
<Button
onClick={handleStopGeneration}
variant="destructive"
size="icon"
className="h-10 w-10"
>
<StopCircle className="w-4 h-4" />
</Button>
) : (
<Button
onClick={() => handleSend(false)}
disabled={isGenerating}
size="icon"
className="h-10 w-10"
>
<Send className="w-4 h-4" />
</Button>
)}
</div>
<div className="flex items-center justify-end gap-2">
{/* Render file upload button */}
<Button
variant="ghost"
size="sm"
className="h-6 px-2 sm:px-3"
onClick={handleFileUpload}
>
<Paperclip className="h-3 w-3 sm:mr-1" />
<span className="hidden sm:inline">File</span>
</Button>
{/* Render image upload button */}
<Button
variant="ghost"
size="sm"
className="h-6 px-2 sm:px-3"
onClick={handleImageUpload}
>
<ImageIcon className="h-3 w-3 sm:mr-1" />
<span className="hidden sm:inline">Image</span>
</Button>
</div>
</div>
)
}

View File

@ -1,254 +0,0 @@
import { Check, Copy, CornerUpLeft } from "lucide-react"
import React, { useState } from "react"
import ReactMarkdown from "react-markdown"
import remarkGfm from "remark-gfm"
import { Button } from "../../ui/button"
import ContextTabs from "./ContextTabs"
import { copyToClipboard, stringifyContent } from "./lib/chatUtils"
import { createMarkdownComponents } from "./lib/markdownComponents"
import { MessageProps } from "./types"
export default function ChatMessage({
message,
setContext,
setIsContextExpanded,
socket,
handleApplyCode,
activeFileName,
activeFileContent,
editorRef,
mergeDecorationsCollection,
setMergeDecorationsCollection,
selectFile,
}: MessageProps) {
// State for expanded message index
const [expandedMessageIndex, setExpandedMessageIndex] = useState<
number | null
>(null)
// State for copied text
const [copiedText, setCopiedText] = useState<string | null>(null)
// Render copy button for text content
const renderCopyButton = (text: any) => (
<Button
onClick={() => copyToClipboard(stringifyContent(text), setCopiedText)}
size="sm"
variant="ghost"
className="p-1 h-6"
>
{copiedText === stringifyContent(text) ? (
<Check className="w-4 h-4 text-green-500" />
) : (
<Copy className="w-4 h-4" />
)}
</Button>
)
// Set context for code when asking about code
const askAboutCode = (code: any) => {
const contextString = stringifyContent(code)
const newContext = `Regarding this code:\n${contextString}`
// Format timestamp to match chat message format (HH:MM PM)
const timestamp = new Date().toLocaleTimeString("en-US", {
hour12: true,
hour: "2-digit",
minute: "2-digit",
})
// Instead of replacing context, append to it
if (message.role === "assistant") {
// For assistant messages, create a new context tab with the response content and timestamp
setContext(newContext, `AI Response (${timestamp})`, {
start: 1,
end: contextString.split("\n").length,
})
} else {
// For user messages, create a new context tab with the selected content and timestamp
setContext(newContext, `User Chat (${timestamp})`, {
start: 1,
end: contextString.split("\n").length,
})
}
setIsContextExpanded(false)
}
// Render markdown elements for code and text
const renderMarkdownElement = (props: any) => {
const { node, children } = props
const content = stringifyContent(children)
return (
<div className="relative group">
<div className="absolute top-0 right-0 flex opacity-0 group-hover:opacity-30 transition-opacity">
{renderCopyButton(content)}
<Button
onClick={() => askAboutCode(content)}
size="sm"
variant="ghost"
className="p-1 h-6"
>
<CornerUpLeft className="w-4 h-4" />
</Button>
</div>
{/* Render markdown element */}
{React.createElement(
node.tagName,
{
...props,
className: `${
props.className || ""
} hover:bg-transparent rounded p-1 transition-colors`,
},
children
)}
</div>
)
}
// Create markdown components
const components = createMarkdownComponents(
renderCopyButton,
renderMarkdownElement,
askAboutCode,
activeFileName,
activeFileContent,
editorRef,
handleApplyCode,
selectFile,
mergeDecorationsCollection,
setMergeDecorationsCollection,
)
return (
<div className="text-left relative">
<div
className={`relative p-2 rounded-lg ${
message.role === "user"
? "bg-foreground text-background"
: "bg-background text-foreground"
} max-w-full`}
>
{/* Render context tabs */}
{message.role === "user" && message.context && (
<div className="mb-2 bg-input rounded-lg">
<ContextTabs
socket={socket}
activeFileName=""
onAddFile={() => {}}
contextTabs={parseContextToTabs(message.context)}
onRemoveTab={() => {}}
isExpanded={expandedMessageIndex === 0}
onToggleExpand={() =>
setExpandedMessageIndex(expandedMessageIndex === 0 ? null : 0)
}
className="[&_div:first-child>div:first-child>div]:bg-[#0D0D0D] [&_button:first-child]:hidden [&_button:last-child]:hidden"
/>
{expandedMessageIndex === 0 && (
<div className="relative">
<div className="absolute top-0 right-0 flex p-1">
{renderCopyButton(
message.context.replace(/^Regarding this code:\n/, "")
)}
</div>
{/* Render code textarea */}
{(() => {
const code = message.context.replace(
/^Regarding this code:\n/,
""
)
const match = /language-(\w+)/.exec(code)
const language = match ? match[1] : "typescript"
return (
<div className="pt-6">
<textarea
value={code}
onChange={(e) => {
const updatedContext = `Regarding this code:\n${e.target.value}`
setContext(updatedContext, "Selected Content", {
start: 1,
end: e.target.value.split("\n").length,
})
}}
className="w-full p-2 bg-[#1e1e1e] text-foreground font-mono text-sm rounded"
rows={code.split("\n").length}
style={{
resize: "vertical",
minHeight: "100px",
maxHeight: "400px",
}}
/>
</div>
)
})()}
</div>
)}
</div>
)}
{/* Render copy and ask about code buttons */}
{message.role === "user" && (
<div className="absolute top-0 right-0 p-1 flex opacity-40">
{renderCopyButton(message.content)}
<Button
onClick={() => askAboutCode(message.content)}
size="sm"
variant="ghost"
className="p-1 h-6"
>
<CornerUpLeft className="w-4 h-4" />
</Button>
</div>
)}
{/* Render markdown content */}
{message.role === "assistant" ? (
<ReactMarkdown remarkPlugins={[remarkGfm]} components={components}>
{message.content}
</ReactMarkdown>
) : (
<div className="whitespace-pre-wrap group">{message.content}</div>
)}
</div>
</div>
)
}
// Parse context to tabs for context tabs component
function parseContextToTabs(context: string) {
// Use specific regex patterns to avoid matching import statements
const sections = context.split(/(?=File |Code from |Image \d{1,2}:)/)
return sections
.map((section, index) => {
const lines = section.trim().split("\n")
const titleLine = lines[0]
let content = lines.slice(1).join("\n").trim()
// Remove code block markers for display
content = content.replace(/^```[\w-]*\n/, "").replace(/\n```$/, "")
// Determine the type of context
const isFile = titleLine.startsWith("File ")
const isImage = titleLine.startsWith("Image ")
const type = isFile ? "file" : isImage ? "image" : "code"
const name = titleLine
.replace(/^(File |Code from |Image )/, "")
.replace(":", "")
.trim()
// Skip if the content is empty or if it's just an import statement
if (!content || content.trim().startsWith('from "')) {
return null
}
return {
id: `context-${index}`,
type: type as "file" | "code" | "image",
name: name,
content: content,
}
})
.filter(
(tab): tab is NonNullable<typeof tab> =>
tab !== null && tab.content.length > 0
)
}
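
As a worked example (a sketch, not part of the diff), the parser above turns a single "File …" section into one file tab:

```ts
// Worked example for parseContextToTabs; the file name is illustrative.
const context = "File ContextTabs.tsx:\nexport default function ContextTabs() {}"

// parseContextToTabs(context) would return:
// [
//   {
//     id: "context-0",
//     type: "file",
//     name: "ContextTabs.tsx",
//     content: "export default function ContextTabs() {}",
//   },
// ]
```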

View File

@ -1,177 +0,0 @@
import { Input } from "@/components/ui/input"
import {
Popover,
PopoverContent,
PopoverTrigger,
} from "@/components/ui/popover"
import { TFile, TFolder } from "@/lib/types"
import { FileText, Image as ImageIcon, Plus, X } from "lucide-react"
import { useState } from "react"
import { Button } from "../../ui/button"
import { ContextTab, ContextTabsProps } from "./types"
// Ignore certain folders and files from the file tree
import { ignoredFiles, ignoredFolders } from "./lib/ignored-paths"
export default function ContextTabs({
contextTabs,
onRemoveTab,
className,
files = [],
onFileSelect,
}: ContextTabsProps & { className?: string }) {
// State for preview tab
const [previewTab, setPreviewTab] = useState<ContextTab | null>(null)
const [searchQuery, setSearchQuery] = useState("")
// Allow preview for images and code selections from editor
const togglePreview = (tab: ContextTab) => {
if (!tab.lineRange && tab.type !== "image") {
return
}
// Toggle preview for images and code selections from editor
if (previewTab?.id === tab.id) {
setPreviewTab(null)
} else {
setPreviewTab(tab)
}
}
// Remove tab from context when clicking on X
const handleRemoveTab = (id: string) => {
if (previewTab?.id === id) {
setPreviewTab(null)
}
onRemoveTab(id)
}
// Get all files from the file tree to search for context
const getAllFiles = (items: (TFile | TFolder)[]): TFile[] => {
return items.reduce((acc: TFile[], item) => {
// Add file if it's not ignored
if (
item.type === "file" &&
!ignoredFiles.some(
(pattern: string) =>
item.name.endsWith(pattern.replace("*", "")) ||
item.name === pattern
)
) {
acc.push(item)
// Add all files from folder if it's not ignored
} else if (
item.type === "folder" &&
!ignoredFolders.some((folder: string) => folder === item.name)
) {
acc.push(...getAllFiles(item.children))
}
return acc
}, [])
}
// Get all files from the file tree to search for context when adding context
const allFiles = getAllFiles(files)
const filteredFiles = allFiles.filter((file) =>
file.name.toLowerCase().includes(searchQuery.toLowerCase())
)
return (
<div className={`border-none ${className || ""}`}>
<div className="flex flex-col">
<div className="flex items-center gap-1 overflow-hidden mb-2 flex-wrap">
{/* Add context tab button */}
<Popover>
<PopoverTrigger asChild>
<Button variant="ghost" size="icon" className="h-6 w-6">
<Plus className="h-4 w-4" />
</Button>
</PopoverTrigger>
{/* Add context tab popover */}
<PopoverContent className="w-64 p-2">
<div className="flex gap-2 mb-2">
<Input
placeholder="Search files..."
value={searchQuery}
onChange={(e) => setSearchQuery(e.target.value)}
className="flex-1"
/>
</div>
<div className="max-h-[200px] overflow-y-auto">
{filteredFiles.map((file) => (
<Button
key={file.id}
variant="ghost"
className="w-full justify-start text-sm mb-1"
onClick={() => onFileSelect?.(file)}
>
<FileText className="h-4 w-4 mr-2" />
{file.name}
</Button>
))}
</div>
</PopoverContent>
</Popover>
{/* Placeholder shown when no context has been added */}
{contextTabs.length === 0 && (
<div className="flex items-center gap-1 px-2 rounded">
<span className="text-sm text-muted-foreground">Add Context</span>
</div>
)}
{/* Render context tabs */}
{contextTabs.map((tab) => (
<div
key={tab.id}
className="flex items-center gap-1 px-2 bg-input rounded text-sm cursor-pointer hover:bg-muted"
onClick={() => togglePreview(tab)}
>
{tab.type === "image" && <ImageIcon className="h-3 w-3" />}
<span>{tab.name}</span>
<Button
variant="ghost"
size="icon"
className="h-4 w-4"
onClick={(e) => {
e.stopPropagation()
handleRemoveTab(tab.id)
}}
>
<X className="h-3 w-3" />
</Button>
</div>
))}
</div>
{/* Preview Section */}
{previewTab && (
<div className="p-2 bg-input rounded-md max-h-[200px] overflow-auto mb-2">
{previewTab.type === "image" ? (
<img
src={previewTab.content}
alt={previewTab.name}
className="max-w-full h-auto"
/>
) : (
previewTab.lineRange && (
<>
<div className="text-xs text-muted-foreground mt-1">
Lines {previewTab.lineRange.start}-
{previewTab.lineRange.end}
</div>
<pre className="text-xs font-mono whitespace-pre-wrap">
{previewTab.content}
</pre>
</>
)
)}
{/* Render preview content for file tabs */}
{previewTab.type === "file" && (
<pre className="text-xs font-mono whitespace-pre-wrap">
{previewTab.content}
</pre>
)}
</div>
)}
</div>
</div>
)
}

View File

@ -1,298 +0,0 @@
import { ScrollArea } from "@/components/ui/scroll-area"
import { useSocket } from "@/context/SocketContext"
import { TFile } from "@/lib/types"
import { ChevronDown, X } from "lucide-react"
import { nanoid } from "nanoid"
import { useEffect, useRef, useState } from "react"
import LoadingDots from "../../ui/LoadingDots"
import ChatInput from "./ChatInput"
import ChatMessage from "./ChatMessage"
import ContextTabs from "./ContextTabs"
import { handleSend, handleStopGeneration } from "./lib/chatUtils"
import { AIChatProps, ContextTab, Message } from "./types"
export default function AIChat({
activeFileContent,
activeFileName,
onClose,
editorRef,
lastCopiedRangeRef,
files,
templateType,
handleApplyCode,
selectFile,
mergeDecorationsCollection,
setMergeDecorationsCollection,
projectName,
}: AIChatProps) {
// Initialize socket and messages
const { socket } = useSocket()
const [messages, setMessages] = useState<Message[]>([])
// Initialize input and state for generating messages
const [input, setInput] = useState("")
const [isGenerating, setIsGenerating] = useState(false)
// Initialize chat container ref and abort controller ref
const chatContainerRef = useRef<HTMLDivElement>(null)
const abortControllerRef = useRef<AbortController | null>(null)
// Initialize context tabs and state for expanding context
const [contextTabs, setContextTabs] = useState<ContextTab[]>([])
const [isContextExpanded, setIsContextExpanded] = useState(false)
const [isLoading, setIsLoading] = useState(false)
// Initialize textarea ref
const textareaRef = useRef<HTMLTextAreaElement>(null)
// State for auto-scroll and the scroll-to-bottom button
const [autoScroll, setAutoScroll] = useState(true)
const [showScrollButton, setShowScrollButton] = useState(false)
// scroll to bottom of chat when messages change
useEffect(() => {
if (autoScroll) {
scrollToBottom()
}
}, [messages, autoScroll])
// Scroll the chat container to the bottom; `force` bypasses the autoScroll check
const scrollToBottom = (force: boolean = false) => {
if (!chatContainerRef.current || (!autoScroll && !force)) return
chatContainerRef.current.scrollTo({
top: chatContainerRef.current.scrollHeight,
behavior: force ? "smooth" : "auto",
})
}
// function to handle scroll events
const handleScroll = () => {
if (!chatContainerRef.current) return
const { scrollTop, scrollHeight, clientHeight } = chatContainerRef.current
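// Treat the view as "at bottom" within a 50px tolerance so small offsets don't disable auto-scroll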
const isAtBottom = Math.abs(scrollHeight - scrollTop - clientHeight) < 50
setAutoScroll(isAtBottom)
setShowScrollButton(!isAtBottom)
}
// scroll event listener
useEffect(() => {
const container = chatContainerRef.current
if (container) {
container.addEventListener("scroll", handleScroll)
return () => container.removeEventListener("scroll", handleScroll)
}
}, [])
// Add context tab to context tabs
const addContextTab = (
type: string,
name: string,
content: string,
lineRange?: { start: number; end: number }
) => {
const newTab = {
id: nanoid(),
type: type as "file" | "code" | "image",
name,
content,
lineRange,
}
setContextTabs((prev) => [...prev, newTab])
}
// Remove context tab from context tabs
const removeContextTab = (id: string) => {
setContextTabs((prev) => prev.filter((tab) => tab.id !== id))
}
// Add file to context tabs
const handleAddFile = (tab: ContextTab) => {
setContextTabs((prev) => [...prev, tab])
}
// Format code content to remove starting and ending code block markers if they exist
const formatCodeContent = (content: string) => {
return content.replace(/^```[\w-]*\n/, "").replace(/\n```$/, "")
}
// Get combined context from context tabs
const getCombinedContext = () => {
if (contextTabs.length === 0) return ""
return contextTabs
.map((tab) => {
if (tab.type === "file") {
const fileExt = tab.name.split(".").pop() || "txt"
const cleanContent = formatCodeContent(tab.content)
return `File ${tab.name}:\n\`\`\`${fileExt}\n${cleanContent}\n\`\`\``
} else if (tab.type === "code") {
const cleanContent = formatCodeContent(tab.content)
return `Code from ${tab.name}:\n\`\`\`typescript\n${cleanContent}\n\`\`\``
} else if (tab.type === "image") {
return `Image ${tab.name}:\n${tab.content}`
}
return `${tab.name}:\n${tab.content}`
})
.join("\n\n")
}
// Handle sending message with context
const handleSendWithContext = () => {
const combinedContext = getCombinedContext()
handleSend(
input,
combinedContext,
messages,
setMessages,
setInput,
setIsContextExpanded,
setIsGenerating,
setIsLoading,
abortControllerRef,
activeFileContent,
false,
templateType,
files,
projectName
)
}
// Set context for the chat
const setContext = (
context: string | null,
name: string,
range?: { start: number; end: number }
) => {
if (!context) {
setContextTabs([])
return
}
// Always add a new tab instead of updating existing ones
addContextTab("code", name, context, range)
}
// update context tabs when file contents change
useEffect(() => {
setContextTabs((prevTabs) =>
prevTabs.map((tab) => {
if (tab.type === "file" && tab.name === activeFileName) {
const fileExt = tab.name.split(".").pop() || "txt"
return {
...tab,
content: `\`\`\`${fileExt}\n${activeFileContent}\n\`\`\``,
}
}
return tab
})
)
}, [activeFileContent, activeFileName])
return (
<div className="flex flex-col h-screen w-full">
<div className="flex justify-between items-center p-2 border-b">
<span className="text-muted-foreground/50 font-medium">CHAT</span>
<div className="flex items-center h-full">
<span className="text-muted-foreground/50 font-medium">
{activeFileName}
</span>
<div className="mx-2 h-full w-px bg-muted-foreground/20"></div>
<button
onClick={onClose}
className="text-muted-foreground/50 hover:text-muted-foreground focus:outline-none"
aria-label="Close AI Chat"
>
<X size={18} />
</button>
</div>
</div>
<ScrollArea
ref={chatContainerRef}
className="flex-grow p-4 space-y-4 relative"
>
{messages.map((message, messageIndex) => (
// Render chat message component for each message
<ChatMessage
key={messageIndex}
message={message}
setContext={setContext}
setIsContextExpanded={setIsContextExpanded}
socket={socket}
handleApplyCode={handleApplyCode}
activeFileName={activeFileName}
activeFileContent={activeFileContent}
editorRef={editorRef}
mergeDecorationsCollection={mergeDecorationsCollection}
setMergeDecorationsCollection={setMergeDecorationsCollection}
selectFile={selectFile}
/>
))}
{isLoading && <LoadingDots />}
{/* Scroll-to-bottom button */}
{showScrollButton && (
<button
onClick={() => scrollToBottom(true)}
className="fixed bottom-36 right-6 bg-primary text-primary-foreground rounded-md border border-primary p-0.5 shadow-lg hover:bg-primary/90 transition-all"
aria-label="Scroll to bottom"
>
<ChevronDown className="h-5 w-5" />
</button>
)}
</ScrollArea>
<div className="p-4 border-t mb-14">
{/* Render context tabs component */}
<ContextTabs
activeFileName={activeFileName}
onAddFile={handleAddFile}
contextTabs={contextTabs}
onRemoveTab={removeContextTab}
isExpanded={isContextExpanded}
onToggleExpand={() => setIsContextExpanded(!isContextExpanded)}
files={files}
socket={socket}
onFileSelect={(file: TFile) => {
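// Fetch the selected file's contents over the socket, wrap them in a code fence, and add them as a file context tab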
socket?.emit("getFile", { fileId: file.id }, (response: string) => {
const fileExt = file.name.split(".").pop() || "txt"
const formattedContent = `\`\`\`${fileExt}\n${response}\n\`\`\``
addContextTab("file", file.name, formattedContent)
if (textareaRef.current) {
textareaRef.current.focus()
}
})
}}
/>
{/* Render chat input component */}
<ChatInput
textareaRef={textareaRef}
addContextTab={addContextTab}
editorRef={editorRef}
input={input}
setInput={setInput}
isGenerating={isGenerating}
handleSend={handleSendWithContext}
handleStopGeneration={() => handleStopGeneration(abortControllerRef)}
onImageUpload={(file) => {
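// Read the uploaded image as a data URL and attach it as an image context tab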
const reader = new FileReader()
reader.onload = (e) => {
if (e.target?.result) {
addContextTab("image", file.name, e.target.result as string)
}
}
reader.readAsDataURL(file)
}}
lastCopiedRangeRef={lastCopiedRangeRef}
activeFileName={activeFileName}
contextTabs={contextTabs.map((tab) => ({
...tab,
title: tab.id,
}))}
onRemoveTab={removeContextTab}
/>
</div>
</div>
)
}

Some files were not shown because too many files have changed in this diff