
Personally, I do not mind it if it's on-device, especially small specialised models (e.g. overview generation, audio generation, etc) with no internet access.




In the long term, on-device won't save us from a biased assistant. It might notice we seem tired and insinuate that we could use Mococoa, all natural beans straight from the upper slopes of Mount Nicaragua.

Or—and this happens—it "summarizes" the same text differently, depending on whether the author's name happens to fit a certain ethnicity.


Conversely, it can also save us from biased content, because it can point out all the ways the article we are reading is trying to manipulate our perspective.

With how inexpensive training is getting, it will not be long until we can train our own specialized models to fit our specific needs.


> it can also save us from biased content

I am pessimistic on that front, since:

1. If LLMs can't detect biases in their own output, why would we expect them to reliably detect them in documents in general?

2. As a general rule, deploying bias/tricks/fallacies/BS is much easier than detecting them and explaining why they're wrong.


Modern local models make it pretty easy to imagine a future where this would be useful, but they also make it extremely apparent that that future has not arrived.

Maybe in five years they will be useful enough to be worth including these features.


That was the original intent. They only recently added the "chatbot-y" kind of stuff, since the infra is all already there. The main uses were their translation tools and PDF alt-text generation (which I believe disabling ML will break, as they rely on the on-device transformer tools).

I disabled browser.ml.enable and local translation was still working. In my case, that's all I need, but it looks like it still allows on-device transformers.
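For anyone wanting to try the same thing persistently rather than flipping it in about:config, a minimal sketch of a `user.js` entry could look like this (assuming a standard Firefox profile directory; as noted above, local translation may keep working regardless):

```javascript
// user.js in the Firefox profile directory
// Disables the on-device ML feature gate discussed above.
// (Pref name taken from this thread; behavior may vary by Firefox version.)
user_pref("browser.ml.enable", false);
```

Firefox reads `user.js` at startup and copies its values into `prefs.js`, so this survives updates and profile resets of individual prefs.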

Oh interesting. All the local stuff was gated under browser.ml.enable for a good while, but maybe they've finally decoupled translation from it.




