This page was opened directly from disk (file://), so the browser may block requests to Ollama. Serve it over HTTP instead: run npm run dev (or npx serve .) and open the http://localhost:… address, or set up a CORS proxy in front of Ollama.
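A minimal sketch of the kind of guard a page can use to detect this case before calling Ollama; the OLLAMA_URL constant and the warning wording are illustrative assumptions, not the app's actual code.

```typescript
// Hypothetical guard: pages opened via file:// have an opaque origin, and
// browsers usually reject fetch() calls from them to http://localhost.
const OLLAMA_URL = "http://localhost:11434"; // assumption: Ollama's default port

function servedOverHttp(): boolean {
  return window.location.protocol === "http:" || window.location.protocol === "https:";
}

if (!servedOverHttp()) {
  console.warn(
    `Requests to ${OLLAMA_URL} will likely be blocked. ` +
      "Serve this page with npm run dev (or npx serve .) and open the localhost URL instead."
  );
}
```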
New Conversation
🤗 Transformers.js
Bryan
Before we start, I need to download my brain
A local AI model runs entirely on your device. No data leaves your machine. This is a one-time download and is stored in your browser.
Model: —
You can pick a different model size in the sidebar (Transformers.js section) before downloading.
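For reference, a hedged sketch of what the one-time download looks like with Transformers.js: the weights are fetched once and cached by the browser, so later visits load the model locally. The model id below is an assumption; substitute whichever size you picked in the sidebar.

```typescript
import { pipeline } from "@huggingface/transformers";

// One-time download: Transformers.js fetches the weights once and caches them
// in the browser, so the model loads from local storage on every later visit.
// The model id is an assumption -- use whatever size the sidebar offers.
const generator = await pipeline(
  "text-generation",
  "onnx-community/Qwen2.5-0.5B-Instruct",
  { progress_callback: (p: unknown) => console.log(p) } // feeds the download progress UI
);

const reply = await generator("Hey, I'm Bryan!", { max_new_tokens: 64 });
console.log(reply);
```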
Hey, I'm Bryan!
Your local second brain: no API key, no cloud, just you and me on your machine.
🔒100% Private
⚡Blazing Fast
🆓Completely Free
Try asking:
Bryan is thinking…
Session Summary
Welcome
Hey, I'm Bryan — your local AI. No account. No API key. Just you and me, running on your machine.
Get Ollama running
Bryan talks to models through Ollama. Start the daemon, then connect below.
Check the connection after you start Ollama.
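A sketch of what that connection check amounts to, assuming Ollama is listening on its default port: GET /api/tags lists the locally installed models and doubles as a health check.

```typescript
// Ask the local Ollama daemon which models are installed.
// Assumes the default address http://localhost:11434.
async function checkOllama(baseUrl = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  const data: { models: { name: string }[] } = await res.json();
  return data.models.map((m) => m.name);
}

checkOllama()
  .then((models) => console.log("Connected. Installed models:", models))
  .catch(() => console.log("Ollama not reachable yet. Is the daemon running?"));
```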
Pick a model
Select a model in the sidebar, then type in the main box. Everything stays on this device.
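Under the hood, a single turn is one POST to Ollama's /api/chat endpoint on localhost; the default model name below is an assumption, so use whichever model you selected in the sidebar.

```typescript
// Send one user message to the locally running model and return its reply.
// Nothing leaves the machine: the request goes to the local Ollama daemon.
async function askBryan(prompt: string, model = "llama3.2"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false, // one JSON body instead of a token stream
    }),
  });
  const data: { message: { content: string } } = await res.json();
  return data.message.content;
}

askBryan("Summarize what we talked about today").then(console.log);
```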