Micracode
Open-Source AI Web App Builder

Micracode Demo

Describe an app in natural language and Micracode streams code into an in-browser workspace.
Iterate by chat or edit the code directly in a Monaco editor — everything runs on your laptop.


License: MIT Python 3.12+ Next.js 15 Bun


Your local AI coding workspace — no database, no auth, no cloud.

Star the repo and you'll receive release notifications from GitHub as soon as they go out.


✨ Features

  • 🛠️ Natural-Language Codegen — Describe an app in plain English; Micracode streams a working project into the workspace file by file.

  • 💬 Iterative Chat — Refine your project through conversation. Ask for changes, fixes, or new features and watch them stream in.

  • 📝 In-Browser Monaco Editor — Edit generated code directly in a full Monaco editor; changes persist to disk.

  • 🔌 Pluggable LLM Providers — Ships with Google Gemini by default; switch to OpenAI with one env var. Configurable model IDs.

  • 📦 Local-First Storage — Projects live as plain folders on your filesystem. No database, no auth, no cloud service required.

  • 🧪 Streaming Backend — Server-sent events deliver generated code in real time using a typed stream-event contract shared between web and API.

  • 🗂️ Snapshots & Prompt History — Every project keeps its prompt history and snapshots so you can review or roll back.


πŸ› οΈ Tech Stack

Backend

  • FastAPI β€” High-performance Python web framework
  • LangChain + Google Gemini / OpenAI β€” Pluggable LLM orchestration (gemini-2.5-flash by default)
  • SSE-Starlette β€” Server-sent events for streaming code generation
  • UV β€” Modern Python package manager
  • Pytest β€” Storage and HTTP test suite

Frontend

  • Next.js 15 β€” React framework with App Router
  • React 19 β€” Latest React with concurrent features
  • Tailwind CSS β€” Utility-first CSS framework
  • Radix UI + shadcn/ui β€” Accessible component primitives
  • Monaco Editor β€” VS Code's editor in the browser
  • WebContainer API β€” Run Node.js apps directly in the browser
  • Zustand β€” Lightweight state management
  • ai-sdk β€” Vercel AI SDK for chat streaming

Tooling

  • Bun β€” JS workspace manager and runtime
  • TypeScript β€” End-to-end type safety, with shared types in packages/shared

🚀 Getting Started

Prerequisites

  • Node.js v22.18.0 (pinned via .nvmrc)
  • Bun ≥ 1.1.0
  • Python ≥ 3.12 (managed automatically by uv)
  • uv ≥ 0.4
  • A Google Gemini or OpenAI API key

Environment Setup

Copy the example env file into the API app and add your key:

cp .env.example apps/api/.env
$EDITOR apps/api/.env

Minimum config (Gemini, the default provider):

LLM_PROVIDER=gemini
GOOGLE_API_KEY=your_gemini_api_key

Or use OpenAI:

LLM_PROVIDER=openai
OPENAI_API_KEY=your_openai_api_key
OPENAI_MODEL=gpt-4o

See docs/configuration.md for the full reference and supported model IDs.
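
The provider switch above can be sketched as a small settings helper. This is a hedged illustration only — the field names, defaults, and error handling below are assumptions, not the actual code in apps/api/src/micracode_api/config.py:

```python
import os

# Illustrative defaults: gemini-2.5-flash is the documented default model,
# gpt-4o is the example OPENAI_MODEL from the config snippet above.
DEFAULT_MODELS = {
    "gemini": "gemini-2.5-flash",
    "openai": "gpt-4o",
}

def resolve_llm_config(env: dict) -> dict:
    """Pick provider, API key, and model id from environment variables."""
    provider = env.get("LLM_PROVIDER", "gemini").lower()
    if provider not in DEFAULT_MODELS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
    key_var = "GOOGLE_API_KEY" if provider == "gemini" else "OPENAI_API_KEY"
    api_key = env.get(key_var)
    if not api_key:
        raise ValueError(f"{key_var} must be set for provider {provider!r}")
    # Only the OpenAI path reads an explicit model override in this sketch.
    model = env.get("OPENAI_MODEL") if provider == "openai" else None
    return {"provider": provider, "api_key": api_key,
            "model": model or DEFAULT_MODELS[provider]}

print(resolve_llm_config({"LLM_PROVIDER": "gemini", "GOOGLE_API_KEY": "key"}))
```

The point of the sketch: leaving LLM_PROVIDER unset falls back to Gemini, which is why the minimum config above only needs GOOGLE_API_KEY.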

Installation

nvm use                # picks up .nvmrc -> Node 22.18.0
bun install            # JS workspaces (web + shared)
bun run api:install    # Python deps for the API (creates a uv-managed venv)

Running the Application

Start both apps in parallel:

bun run dev

Or run them individually:

bun run dev:web        # Next.js only
bun run dev:api        # FastAPI only (uvicorn --reload)

Open http://localhost:3000, type a project description into the prompt box, and you're off. Full walkthrough in Getting Started.


πŸ“ Project Structure

micracode/
β”œβ”€β”€ apps/
β”‚   β”œβ”€β”€ web/                    # Next.js 15 frontend
β”‚   β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”‚   β”œβ”€β”€ app/            # App Router pages
β”‚   β”‚   β”‚   β”œβ”€β”€ components/     # React components (incl. shadcn/ui)
β”‚   β”‚   β”‚   β”œβ”€β”€ lib/            # Utilities and clients
β”‚   β”‚   β”‚   └── store/          # Zustand stores
β”‚   β”‚   └── package.json
β”‚   β”‚
β”‚   └── api/                    # FastAPI backend
β”‚       β”œβ”€β”€ src/micracode_api/
β”‚       β”‚   β”œβ”€β”€ agents/         # LLM orchestrator, prompts, model catalog
β”‚       β”‚   β”œβ”€β”€ routers/        # health, models, projects, generate
β”‚       β”‚   β”œβ”€β”€ schemas/        # Pydantic request/response models
β”‚       β”‚   β”œβ”€β”€ starter/        # Starter project templates
β”‚       β”‚   β”œβ”€β”€ config.py       # Settings (env vars)
β”‚       β”‚   β”œβ”€β”€ storage.py      # Local filesystem project storage
β”‚       β”‚   └── main.py         # FastAPI app entry point
β”‚       β”œβ”€β”€ tests/
β”‚       └── pyproject.toml
β”‚
β”œβ”€β”€ packages/
β”‚   └── shared/                 # Shared TypeScript types (stream event contract)
β”‚
β”œβ”€β”€ docs/                       # End-user documentation
└── README.md
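
Because storage is local-first, each project is just a folder you can read and write with plain filesystem calls. A minimal sketch of that idea — the projects-root layout used here (<id>/files/...) is invented for illustration and is not necessarily what storage.py does:

```python
import tempfile
from pathlib import Path

def write_project_file(root: Path, project_id: str, rel_path: str, content: str) -> Path:
    """Persist one generated file under a project's folder on disk."""
    target = root / project_id / "files" / rel_path
    target.parent.mkdir(parents=True, exist_ok=True)  # plain folders, no database
    target.write_text(content)
    return target

root = Path(tempfile.mkdtemp())
write_project_file(root, "demo", "src/index.js", "console.log('hi')\n")
print(sorted(p.relative_to(root).as_posix() for p in root.rglob("*") if p.is_file()))
# prints: ['demo/files/src/index.js']
```

Anything that can walk a directory tree (your editor, git, rsync) works on these projects unchanged — see Projects on Disk in docs/ for where the real folders live.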

🔌 API Endpoints

All endpoints are mounted under /v1.

Method   Endpoint                                   Description
GET      /v1/health                                 Service health check
GET      /v1/models                                 List available LLM models
POST     /v1/generate                               Stream code generation events (SSE)
GET      /v1/projects                               List all projects
POST     /v1/projects                               Create a new project
GET      /v1/projects/{id}                          Get a project by id
DELETE   /v1/projects/{id}                          Delete a project
GET      /v1/projects/{id}/files                    List/read project files
PUT      /v1/projects/{id}/files                    Write project files
GET      /v1/projects/{id}/download                 Download project as archive
GET      /v1/projects/{id}/prompts                  Get prompt history
POST     /v1/projects/{id}/prompts/pop-assistant    Pop last assistant message
GET      /v1/projects/{id}/snapshots                List project snapshots

📚 Documentation

End-user docs live in docs/:

  • Getting Started — install prerequisites, configure an API key, and run the app.
  • Configuration — environment variables, switching between OpenAI and Gemini, and supported model IDs.
  • Using the Workspace — the home page, chat, editor, and preview panels.
  • Projects on Disk — where your generated apps live and how to work with them outside the app.
  • Troubleshooting — common errors and how to fix them.
  • FAQ — short answers to common questions.

🧰 Useful Scripts

bun run dev           # web + api in parallel
bun run dev:web       # Next.js only
bun run dev:api       # FastAPI only (uvicorn --reload, 127.0.0.1:8000)
bun run typecheck     # TS across all workspaces
bun run lint          # eslint across workspaces
bun run format        # prettier
bun run test:api      # pytest (storage + HTTP tests)
bun run api:lint      # ruff check
bun run api:format    # ruff format

πŸ“ License

This project is licensed under the MIT License.


🤝 Contributing

Contributions are welcome! Feel free to open issues and pull requests.


Join our community Discord

Open Source Alternative to Lovable, v0, Bolt, Replit, Emergent. 🌟 Star if you like it!
