
In Firefox, yeah! I use it often.

I have it connected to a local Gemma model running in Ollama. I use it to quickly summarize webpages (nobody really wants to read 15 minutes of personal anecdotes before getting to the one paragraph that actually has relevant information) and to find information within a page, kind of like Ctrl-F on steroids.
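For reference, Ollama exposes a local HTTP API, so a summarization call is just a POST to its /api/generate endpoint. A minimal sketch of that flow (the model tag and prompt wording here are assumptions, not necessarily what the commenter uses):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_summary_request(page_text: str, model: str = "gemma3") -> dict:
    """Build the JSON payload for a one-shot (non-streaming) summarization request."""
    return {
        "model": model,  # assumed model tag; use whatever `ollama list` shows on your machine
        "prompt": f"Summarize the key points of this page in 3 bullets:\n\n{page_text}",
        "stream": False,  # ask for a single JSON response instead of a token stream
    }


def summarize(page_text: str) -> str:
    """Send the request to a locally running Ollama instance (requires `ollama serve`)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_summary_request(page_text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Since everything stays on localhost, nothing about the page ever leaves the machine.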

The machine is sitting there anyway, and the extra electricity cost is buried in the hours of gaming that GPU is also used for, so I haven't noticed it yet. If you game, the graphics card will be obsolete long before the small amount of extra wear becomes obvious. YMMV if you don't already have a gaming rig lying around.



An AI specifically customized to pull the recipe out of long rambling cooking blog posts would be great. I'd use that regularly.


That's not "AI", that's just a basic Firefox extension, and one that's trivially easy to search for.

Literally Google's first hit for me: https://www.reddit.com/r/Cooking/comments/jkw62b/i_developed...


Something like this I wouldn't mind: privacy-focused, local-only models that let you use your own existing services. Can you give a quick pointer on how to connect Firefox to Ollama?


Docs here: https://docs.openwebui.com/tutorials/integrations/firefox-si...

I think it's technically experimental, but I've been using this since day one with no issues.


Use Open WebUI with Ollama.

Open WebUI is compatible with the Firefox sidebar.

So grab Ollama and your preferred model.

Install Open WebUI.

Connect Open WebUI to Ollama.

Then in Firefox open about:config

and set browser.ml.chat.provider to your local Open WebUI instance.

Google suggests you might also need to set browser.ml.chat.hideLocalhost to false, but I don't remember having to do that.
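Concretely, the steps above might look something like this. This is a hedged sketch: the Docker invocation and port follow the standard Open WebUI quickstart defaults, and the model tag is just an example.

```shell
# 1. Grab Ollama and a model (model tag is an example; pick any you like)
ollama pull gemma3

# 2. Install and run Open WebUI (Docker is one common route; pip also works)
docker run -d -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# 3. In Firefox, open about:config and set:
#      browser.ml.chat.provider      -> http://localhost:3000
#      browser.ml.chat.hideLocalhost -> false   (reportedly needed in some setups)
```

With those prefs set, the Firefox AI sidebar loads your local Open WebUI instance instead of a hosted chatbot.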


The default AI integration doesn't seem to support this. The only thing I could find that does is called PageAssist, and it's a third-party extension. Is that what you're using?

https://addons.mozilla.org/en-US/firefox/addon/page-assist/


My mistake, I left a step out. Use Open WebUI with Ollama. Open WebUI is compatible with the Firefox sidebar.

So grab Ollama and your preferred model, then install Open WebUI.

Then open about:config

and set browser.ml.chat.provider to your local Open WebUI instance.

Google suggests you might also need to set browser.ml.chat.hideLocalhost to false, but I don't remember having to do that.



