TheMachineStops@discuss.tchncs.de to Firefox@lemmy.ml • Firefox introduces AI as experimental feature
41 · 6 days ago
It gives you many options for what to use; you can use Llama, which runs offline. It needs to be enabled first, though: about:config > browser.ml.chat.hideLocalhost.
Lol, that is certainly true, and you would also need to set it up manually, which even power users might struggle with. Thankfully there is an easy-to-follow guide here: https://ai-guide.future.mozilla.org/content/running-llms-locally/.
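For anyone curious what the manual setup roughly looks like, here is a minimal sketch. It assumes you use llama.cpp's `llama-server` as the local backend (the Mozilla guide covers other options like llamafile too); the model path and port are placeholders, and the exact Firefox menu path may differ between versions.

```shell
# Serve a local model over HTTP on localhost (llama.cpp's built-in server).
# ./models/your-model.gguf is a placeholder for a GGUF model you downloaded.
llama-server -m ./models/your-model.gguf --port 8080

# Then, in Firefox:
#   1. Open about:config and set browser.ml.chat.hideLocalhost to false.
#   2. In the AI chatbot settings, a "localhost" provider option should
#      now appear; point it at the local server started above.
```

This keeps everything offline: the chatbot sidebar talks only to the server running on your own machine.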