Browser AI Chat

About

An AI chat application that runs entirely in the browser — no server, no API keys, no data leaving your device. It supports multiple local AI backends:

  • Gemini Nano via Chrome’s built-in AI APIs
  • Transformers.js for running Hugging Face models in-browser
  • WebLLM for larger models via WebGPU
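Choosing among these backends typically comes down to feature detection. The sketch below is a hypothetical illustration, not the app's actual code: the names `pickBackend`, `Capabilities`, and `BackendId` are assumptions, and the detection hints in the comments reflect commonly used checks (a Prompt API global for built-in AI, `navigator.gpu` for WebGPU).

```typescript
// Hypothetical backend selection based on detected browser capabilities.
// Names and structure are illustrative, not taken from this project.

type BackendId = "builtin" | "webllm" | "transformers";

interface Capabilities {
  builtinAI: boolean; // e.g. a Prompt API global such as `LanguageModel` exists
  webgpu: boolean;    // e.g. `navigator.gpu` is available
}

function pickBackend(caps: Capabilities): BackendId {
  if (caps.builtinAI) return "builtin"; // browser-native, instant startup
  if (caps.webgpu) return "webllm";     // larger models via WebGPU
  return "transformers";                // WASM fallback, runs everywhere
}

// In a browser, capabilities might be detected roughly like:
//   const caps = {
//     builtinAI: "LanguageModel" in globalThis,
//     webgpu: "gpu" in navigator,
//   };
```

The fallback order mirrors the trade-off described later in this README: prefer the fastest-starting option, then the most capable, then the most portable.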

Why I built this

I wanted to explore what’s possible with client-side AI in 2025+. As browser capabilities improve (WebGPU, built-in AI APIs), more AI workloads can run locally without sacrificing too much quality. This project is a testbed for those possibilities.

Technical highlights

  • Zero backend — everything runs in the browser
  • WebGPU acceleration where available
  • Conversation history stored locally
  • Model switching between different AI backends


Features

  • Local AI Models: Built-in AI (Gemini Nano), Transformers.js, WebLLM
  • Complete Privacy: All processing happens in your browser
  • Multi-modal: Text, images, documents, audio
  • Offline: Works without internet after setup
  • PWA: Install like a native app
  • File Support: Drag-and-drop attachments
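One way the multi-modal and drag-and-drop features could fit together is routing each attached file by its MIME type. This is a minimal sketch under assumed names (`classifyAttachment`, `AttachmentKind` are illustrative, not the project's actual API):

```typescript
// Hypothetical routing of drag-and-dropped files by MIME type.
// Function and type names are assumptions for illustration.

type AttachmentKind = "image" | "audio" | "text" | "document";

function classifyAttachment(mimeType: string): AttachmentKind {
  if (mimeType.startsWith("image/")) return "image";
  if (mimeType.startsWith("audio/")) return "audio";
  if (mimeType.startsWith("text/")) return "text";
  return "document"; // PDFs and anything else handled as documents
}
```

In a drop handler this would run over `event.dataTransfer.files`, using each `File`'s `type` property.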

Tech Stack

  • React + TypeScript + TailwindCSS
  • Vite + PWA
  • TanStack Router
  • IndexedDB (Dexie)
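Since the stack stores conversation history in IndexedDB via Dexie, the local schema might look roughly like the sketch below. The table and field names are assumptions; only the `"++id, field"` index strings follow Dexie's actual `stores()` syntax (auto-incremented primary key first, then indexed fields).

```typescript
// Hypothetical shape of locally stored chat data. Names are illustrative.

interface Conversation {
  id?: number;        // auto-incremented primary key ("++id" in Dexie)
  title: string;
  updatedAt: number;  // epoch ms, indexed for sorting recent chats
}

interface Message {
  id?: number;
  conversationId: number; // indexed key linking back to a conversation
  role: "user" | "assistant";
  content: string;
  createdAt: number;
}

// Dexie schema strings: primary key first, then indexed fields.
const schema = {
  conversations: "++id, updatedAt",
  messages: "++id, conversationId, createdAt",
};

// With Dexie this would be wired up along the lines of:
//   const db = new Dexie("browser-chat");
//   db.version(1).stores(schema);
```

Keeping non-indexed fields (like `content`) out of the schema string is idiomatic Dexie: only the keys you query on need to appear there.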

Development

git clone https://github.com/bimsina/browser-chat.git
pnpm install
pnpm dev

Model Types

  • Built-in AI: browser-native (Chrome/Edge only); instant startup
  • Transformers.js: ONNX models; good variety; medium-sized downloads
  • WebLLM: large models; most capable; biggest downloads

Browser Support

  • Chrome/Edge: Full support
  • Firefox: Transformers.js + WebLLM
  • Safari: Transformers.js only
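The support table above can be expressed as a small lookup, useful for hiding unavailable backends in the UI. This is a hedged sketch; `supportedBackends` and the `Browser` type are illustrative names, and the mapping simply restates the table (Safari limited to Transformers.js, Firefox without the built-in AI APIs):

```typescript
// Hypothetical support matrix mirroring the browser support list.

type Browser = "chrome" | "edge" | "firefox" | "safari";

const SUPPORT: Record<Browser, string[]> = {
  chrome: ["builtin", "transformers", "webllm"],  // full support
  edge: ["builtin", "transformers", "webllm"],    // full support
  firefox: ["transformers", "webllm"],            // no built-in AI APIs
  safari: ["transformers"],                       // Transformers.js only
};

function supportedBackends(browser: Browser): string[] {
  return SUPPORT[browser];
}
```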