A modern, open-source web interface for OpenCode - the open-source AI coding assistant. Built with TanStack Start and React.
OpenCode ChatUI provides a beautiful, feature-rich web interface to interact with your locally running OpenCode server. It offers rich rendering of AI responses including tool calls, file diffs, search results, and more.
- Multi-model Support - Select from any model configured in your OpenCode server
- Session Management - Create, switch, and delete chat sessions
- Real-time Updates - Server-Sent Events (SSE) for live response streaming (see the sketch below)
- Model Persistence - Selected model is remembered across page refreshes
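As a rough illustration of the streaming feature, the sketch below subscribes to the server's event stream with the browser's native `EventSource` API. The `/event` path and the payload shape are assumptions here, not confirmed details of the OpenCode server API, so check your server's reference.

```typescript
// Minimal sketch: listening for live updates over SSE with the standard
// EventSource API. The `/event` endpoint path and the JSON payload shape
// are assumptions, not confirmed details of the OpenCode server API.
const serverUrl = "http://localhost:4096";
const source = new EventSource(`${serverUrl}/event`);

source.onmessage = (event: MessageEvent) => {
  // Each SSE message carries a JSON payload describing a server-side update,
  // e.g. a streamed chunk of an assistant reply.
  const update = JSON.parse(event.data);
  console.log("OpenCode event:", update);
};

source.onerror = () => {
  // EventSource reconnects automatically; log for visibility.
  console.warn("SSE connection interrupted, retrying...");
};
```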
Toggle visibility of tool calls to see exactly what the AI is doing:
| Tool | Visualization |
|---|---|
| Edit | GitHub-style unified diff highlighting additions and deletions (see the sketch below) |
| Write | New file content with line numbers and green highlighting |
| Read | File content viewer with line numbers |
| Grep/Search | Results grouped by file with pattern highlighting |
| TodoWrite | Task list with progress bar and status badges |
| Bash | Command input/output display |
- Expandable tool call views with "More/Less" toggle
- Full-screen modal for detailed viewing
- Copy content buttons
- Token usage and cost tracking
- Export session as JSON
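To give a feel for the Edit visualization, here is a minimal sketch of building a unified diff with the `diff` library listed in the tech stack. The sample file contents and variable names are made up for illustration; the project's actual DiffViewer component may work differently.

```typescript
// Minimal sketch: producing a GitHub-style unified diff for an Edit tool call
// with the `diff` (jsdiff) library. The sample file contents are illustrative;
// this is not the project's actual DiffViewer implementation.
import { createPatch } from "diff";

const before = 'export function greet(name: string) {\n  return "Hello " + name;\n}\n';
const after = "export function greet(name: string) {\n  return `Hello, ${name}!`;\n}\n";

// createPatch returns a unified diff string; lines starting with "+" or "-"
// can then be highlighted as additions or deletions in the UI.
const patch = createPatch("src/greet.ts", before, after);
console.log(patch);
```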
- Node.js 18+ or Bun
- OpenCode Server running locally (default: http://localhost:4096)
```bash
git clone https://github.com/redentordev/opencode-chatui.git
cd opencode-chatui
```

```bash
# Using npm
npm install

# Using bun
bun install
```

Make sure you have the OpenCode server running. See the OpenCode Server Documentation for setup instructions.
```bash
# Default runs on port 4096
opencode server
```

```bash
# Using npm
npm run dev

# Using bun
bun run dev
```

The app will be available at http://localhost:3000.
| Variable | Default | Description |
|---|---|---|
| `OPENCODE_URL` | http://localhost:4096 | Default OpenCode server URL (server-side) |
Create a .env file in the project root:
```
OPENCODE_URL=http://localhost:4096
```

You can also change the OpenCode server URL at runtime through the Settings page:
- Click the Settings icon in the header
- In the "Server Connection" section, enter your OpenCode server URL
- Click "Test" to verify the connection
- Click "Save & Connect" to apply
The runtime URL is stored in localStorage and takes precedence over the environment variable.
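A minimal sketch of that precedence is shown below; the localStorage key name is a hypothetical placeholder, not necessarily the one this project uses.

```typescript
// Minimal sketch of the resolution order described above: a runtime URL saved
// from the Settings page (localStorage) wins over the OPENCODE_URL environment
// variable, which in turn falls back to the default. The storage key name is
// a hypothetical placeholder.
const STORAGE_KEY = "opencode-server-url";

export function resolveServerUrl(): string {
  // 1. Runtime override saved by the Settings page (browser only).
  if (typeof window !== "undefined") {
    const saved = window.localStorage.getItem(STORAGE_KEY);
    if (saved) return saved;
  }
  // 2. Server-side environment variable, then 3. the built-in default.
  return process.env.OPENCODE_URL ?? "http://localhost:4096";
}
```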
- Framework: TanStack Start (React meta-framework)
- Routing: TanStack Router
- Data Fetching: TanStack Query + tRPC
- Styling: Tailwind CSS v4
- Icons: Lucide React
- Markdown: Streamdown
- Diff: diff
```
src/
├── components/
│   ├── chat/                    # Chat-specific components
│   │   ├── ChatLayouts.tsx      # Main chat layout components
│   │   ├── ConnectionStatus.tsx # SSE connection indicator
│   │   ├── MessageDisplay.tsx   # Message rendering with markdown
│   │   ├── ModelSelector.tsx    # Model dropdown with search
│   │   ├── SessionList.tsx      # Sidebar session tree
│   │   └── ShareUrlModal.tsx    # Session sharing modal
│   ├── DiffViewer.tsx           # GitHub-style diff viewer
│   ├── DirectoryTree.tsx        # File tree visualization
│   ├── FileViewer.tsx           # Read/Write file content viewer
│   ├── GrepResults.tsx          # Search results viewer
│   ├── Header.tsx               # App header component
│   ├── PermissionModal.tsx      # Tool permission approval
│   ├── SessionTodoPanel.tsx     # Task list side panel
│   ├── StatusBar.tsx            # Bottom status bar with context usage
│   ├── StreamdownConfig.tsx     # Markdown renderer config
│   ├── TodoList.tsx             # Todo item rendering
│   ├── ToolCall.tsx             # Tool call wrapper component
│   └── TreeView.tsx             # Reusable tree component
├── hooks/
│   └── useOpencode.ts           # Main OpenCode hook with SSE
├── integrations/
│   ├── tanstack-query/          # TanStack Query setup
│   └── trpc/                    # tRPC setup and router
├── lib/
│   ├── opencode.server.ts       # Server-side OpenCode client
│   └── utils.ts                 # Utility functions
├── routes/
│   ├── __root.tsx               # Root layout
│   ├── api.trpc.$.tsx           # tRPC API handler
│   ├── index.tsx                # Main chat page
│   └── settings.tsx             # Settings & provider auth
└── types/
    └── opencode.ts              # TypeScript types
```
```bash
# Build the application
npm run build

# Preview the production build
npm run serve
```

This project includes Cloudflare Workers configuration:

```bash
npm run deploy
```

See `wrangler.jsonc` for configuration.
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is open source and available under the MIT License.
- OpenCode - The open-source AI coding assistant
- TanStack - For the amazing React tools
- Vercel - For the Streamdown markdown library
Special thanks to the OpenCode team for building such an amazing open-source AI coding assistant! Your work has made projects like this possible.
vibe coded with ❤️ by @redentordev
