Show HN: I built a browser AI to use GPT‑OSS locally (no server)


Our small team built an open-source browser extension because we wanted page summarization, translation, and chat without sending data off-device.

It runs gpt-oss locally via Ollama. Optionally, you can switch to GPT-5 by adding an OpenAI API key.

To try it:

1. Install the extension (GitHub).

2. Install Ollama, then run ollama pull gpt-oss (rough commands below).

3. In NativeMind, choose "gpt-oss", then highlight text or click the toolbar button to summarize, translate, or chat.
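For step 2, this is roughly what the Ollama side looks like; the install one-liner is the Linux script from ollama.com (macOS/Windows use the desktop installer instead), and the test prompt is just an example:

    # install Ollama on Linux (macOS/Windows: download the installer from ollama.com)
    curl -fsSL https://ollama.com/install.sh | sh

    # download the model and do a quick smoke test from the terminal
    ollama pull gpt-oss
    ollama run gpt-oss "Say hello"

    # confirm the model is available locally
    ollama list

If ollama list shows gpt-oss, the extension should be able to pick it up.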

Notes: we run no backend servers; with gpt-oss everything stays on your machine. If you opt into GPT-5, requests go to OpenAI's API.
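If you want to sanity-check the local path yourself, you can hit Ollama's default API on localhost directly (port 11434 is Ollama's default; the model name here assumes you pulled gpt-oss as above):

    # list models served by the local Ollama instance
    curl http://localhost:11434/api/tags

    # run a one-off generation entirely against localhost
    curl http://localhost:11434/api/generate -d '{"model": "gpt-oss", "prompt": "hello", "stream": false}'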

Would love feedback on setup friction, performance on your hardware, and missing features. Issues/PRs welcome.