Hacker News | rhabarba's comments

I upgraded my OpenBSD machines a few hours ago, and I'm still not entirely sure whether I notice any obvious TCP speed improvement. Then again, they're not really high-load computers. Maybe people with a higher throughput will be amazed.


FreeBSD is not really interested in being as portable as possible, I think. And it is somewhat larger, indeed, so it's not quite as easy to support more platforms.


Yeah, isn't NetBSD the BSD focused on portability and platform support?


Yes, and OpenBSD being a fork of NetBSD still carries some of that spirit.


And both of those have very minimal port coverage compared to Linux, notably on modern ARM/RISC-V. NetBSD has really fallen behind.

Still better than FreeBSD's none.


I mean, are we surprised? Linux has on the order of a million times more users and funding (probably not developers, though, but who knows). So if a port has any financial viability, I certainly expect Linux to "move" first. If anything, I am impressed that OpenBSD and NetBSD keep up as well as they do.


NetBSD and OpenBSD support "old" hardware notably longer than Linux does, though. OpenBSD dropping VAX support was not that long ago.


Yeah I suppose.

But OpenBSD forked from NetBSD like, what, 30 years ago?


There is not even one common "the Linux kernel".


Then again, using Linux has no obvious advantage anymore, as you don't "own" most software you're running anyway.


Being able to trust your OS is an incredible advantage.


Too bad then that desktop Linux is so easy to pwn.


Obtuse gotcha. Trust and security are two different things. And Linux is as secure as anything else against physical attack.

edit: local, not remote attack


When you write, "physical attack", what do you mean? Laptop theft?


I genuinely wonder why it is considered "huge". Does it really matter what percentage of desktop usage one of the several dozen desktop operating systems has?


Role models for computer use in my generation: RMS, Bjarne Stroustrup, Doug McIlroy.

Role models for computer use today: “PewDiePie”.

This is why we can't have nice things.


PewDiePie is awesome. His Linux video has over 6M views. We need to make Linux cool again for the general audience.


I don't know whether operating systems need to be "cool". Is Windows, the market leader, "cool"?

Or rather, I don't know whether those who think PewDiePie is "cool" have the same understanding of "cool" as computer nerds do.


Because computers are cool. It's easier to point to a general operating system than to all the cool software that runs on it. When people say Linux is cool, it's not just because of the kernel; it's everything from the culture to the software stack and ethos.


Still looking forward to the year of the Plan 9 desktop. I'm helping!


You had me at "Browser compatibility".


Chrome embeds a small LLM in the browser (which never stops being a funny thing), allowing it to do local translations.

I assume every browser will do the same as on-device models start becoming more useful.


While I appreciate the on-device approach for a couple of reasons, it is rather ironic that Mozilla needs to document that for them.


Firefox also has on-device translations, for what it's worth.


What's the easiest way to get this functionality outside of the browser, e.g. as a CLI tool?

Last time I looked I wasn't able to find any easy to run models that supported more than a handful of languages.


That depends on what counts as “a handful of languages” for you.

You can use llm for this fairly easily:

    uv tool install llm

    # Set up your model however you like. For instance:
    llm install llm-ollama
    ollama pull mistral-small3.2

    llm --model mistral-small3.2 --system "Translate to English, no other output" --save english
    alias english="llm --template english"

    english "Bonjour"
    english "Hola"
    english "Γειά σου"
    english "你好"
    cat some_file.txt | english
https://llm.datasette.io


Tip: You might want to use `uv tool install llm --with llm-ollama`.

ref: https://github.com/simonw/llm/issues/575


Thanks!


That's just the base/stock/instruct model for general use. There's gotta be a finetune specialized in translation, right? Any recommendations for that?

Plus, mistral-small3.2 has too many parameters. Not all devices can run it fast. That probably isn't the exact translation model being used by Chrome.


I haven’t tried it myself, but NLLB-200 has various sizes going down to 600M params:

https://github.com/facebookresearch/fairseq/tree/nllb/

If running locally is too difficult, you can use llm to access hosted models too.


Setting aside general-purpose LLMs, there exist a handful of models geared towards translation between hundreds of language pairs: Meta's NLLB-200 [0] and M2M-100 [1] can be run using HuggingFace's transformers (plus numpy and sentencepiece), while Google's MADLAD-400 [2], in GGUF format [3], is also supported by llama.cpp.

You could also look into Argos Translate, or just use the same models as Firefox through kotki [4].

[0] https://huggingface.co/facebook/nllb-200-distilled-600M

[1] https://huggingface.co/facebook/m2m100_418M

[2] https://huggingface.co/google/madlad400-3b-mt

[3] https://huggingface.co/models?other=base_model:quantized:goo...

[4] https://github.com/kroketio/kotki
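For the llama.cpp route, a rough sketch of running a quantized MADLAD-400 might look like the following. The GGUF file name is an assumption (the actual name depends on which quantization you download from [3]), and the `<2en>` prefix reflects MADLAD-400's documented convention of selecting the target language with a `<2xx>` token, so double-check against the model card:

    # Translate into English with llama.cpp (file name illustrative);
    # MADLAD-400 picks the target language via a <2xx> token prefix.
    llama-cli -m madlad400-3b-mt-q4_0.gguf -p "<2en> Bonjour tout le monde"
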


You can use bergamot ( https://github.com/browsermt/bergamot-translator ) with Mozilla's models ( https://github.com/mozilla/firefox-translations-models ).

Not the easiest, but easy enough (requires building).

I used these two projects to build an on-device translator for Android.
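For anyone attempting that build, a hedged sketch of the usual CMake flow (the exact flags may differ; consult the bergamot-translator README before relying on this):

    # Clone with submodules and do an out-of-tree Release build.
    git clone --recursive https://github.com/browsermt/bergamot-translator
    cd bergamot-translator
    mkdir -p build && cd build
    cmake -DCMAKE_BUILD_TYPE=Release ..
    make -j"$(nproc)"
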



ollama run gemma3:1b

https://ollama.com/library/gemma3

> support for over 140 languages


Try translating a paragraph with 1B Gemma and compare it to DeepL :) Still amazing it can understand anything at all at that scale, but you can't really rely on it for much, tbh.


If you need to support several languages, you're going to have to have a zoo of models. Small ones just can't handle that many; and they especially aren't good enough for distribution, we only use them for understanding.


What compatibility? It's Chrome-only.


> Everything you create should be in git or similar.

Everything you create should be on a machine you control, preferably in a house different from the one where you created it. Version control is optional (and Git is probably overengineered for your one-man projects, but that's a different discussion).
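For the "machine you control" part, even a plain rsync mirror gets you most of the way without any version control. A minimal sketch, assuming rsync is installed; the paths here are illustrative:

    # Mirror a project directory to a backup location; -a preserves
    # metadata, --delete removes files that vanished from the source.
    SRC=/tmp/demo-projects
    DST=/tmp/demo-backup
    mkdir -p "$SRC"
    echo "draft" > "$SRC/notes.txt"
    rsync -a --delete "$SRC"/ "$DST"/

In practice you would point the destination at a remote host (e.g. `user@backup-host:backups/`) to satisfy the "different house" requirement.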


