It's really quite incredible that one guy basically started a project to build a whole operating system from scratch, for fun and to give himself something interesting to do, and then accidentally created one of the most viable new browser engines in a decade or two...
I've been watching the development videos for a year or two, and the speed at which this has progressed in such a short time is unbelievable. Now that they have multiple volunteers and enough sponsorship to pay more than one developer, it's pretty exciting what could happen here!
He is a world expert on web rendering and an extremely capable C++ developer. One ingredient of their success is implementing the various specifications directly, which is, today, the best way to go about this. They are also heavily test-driven.
He did not even use the C++ standard library: when he says "from scratch", that includes his own string class, for better or worse, which is fine since it's "just for fun", "to learn", etc.
And just when you think a library and an OS are crazy, he announced a browser and a JavaScript engine on top. Then a JIT compiler, and then Jakt, their own novel programming language, because neither C++ nor Rust makes him perfectly happy.
More than his expertise, I admire his modesty and kindness: unlike Linus etc., he is not full of himself, and each of his videos gives lots of credit, name by name, for who did what. A perfect role model for open source.
I think their C++ library is actually one of the reasons they can create capable software so quickly. They have jettisoned a tonne of C++ nonsense and added some really nice modern features, such as how they handle memory and errors.
Also, you would think that having to implement EVERYTHING themselves (they are making their own image decoders, as an example, including SVG) would slow them down. However, since it is written in a mono-repo from soup to nuts, they can very rapidly add support throughout the stack. They do not have any of the endless conversation and arguing that happens between components in open source. They do not have to work around things missing upstream. If they need something, they add it.
This project reminds me of the approach at Xerox PARC in the 1970s. Alan Kay wrote [1]:
(At PARC we avoided) putting any externally controlled system, in-house or out, on one's critical path. ... Thus, virtually all the PARC hardware ... and software ... were completely built in-house by these few dozen researchers.
This sounds disastrous, (because) in programming there is a widespread first order theory that one shouldn't build one's own tools, languages, and especially operating systems. This is true --- an incredible amount of time and energy has gone down these ratholes. On the second hand, if you can build your own tools, languages, and operating systems you absolutely should because the leverage that can be obtained (and often the time not wasted in trying to fix other people's not quite right tools) can be incredible.
When would you (or anyone else) say it's best to consider doing everything yourself from scratch? For example, I want to build a little ARM server. I'm realistically going to use Linux or similar, as I don't want to make my own OS. But if I'm undertaking something on a microcontroller, there's definitely a point where bare metal starts winning. How do you find that point?
I would say: when the existing offerings completely prevent you from doing what you want to do, or require ugly workarounds that are not consistent with your goals, or when future changes in those dependencies might compel you to do extra work just to keep your own system running.
One more reason to start from scratch: to get the functionality you want from an existing offering, you would also have to include a lot of other stuff you don't need, resulting in unnecessary complexity and resource consumption.
I think you misunderstand how open source projects need to work. Some projects (especially those in the web-dev niche) might view their relationship with upstream the way you do. But others do not.
For Ardour, we feel entirely free to just bring an upstream library into our source tree if we need to. And we also have our dependency stack builder that configures and occasionally patches upstream libraries to be just the way we need them. We do not wait for upstream adoption of our patches.
Most recently, for example, we became aware of impending moves by various Linux distros to remove GTK2, which we rely on. Even though we don't support distro builds, we want Linux maintainers to still be able to build the software, so we just merged GTK2 into our source tree.
This idea that using a 3rd party library becomes some sort of constraint is very, very far from reflecting universal truth. If we need to hack a 3rd party lib to make it do what we need, we just do it. Meanwhile, we get all the benefits of that lib. Ardour depends on about 86 libraries - we would be insane to rewrite all that functionality from scratch.
> they are making their own image decoders as an example -- including SVG
Considering the vast number of exploits that continually come out of media decoders everywhere, this basically guarantees I will never ever use this browser.
Have you looked at how it's implemented? The image decoder is completely separate from the main browser and is in a sandboxed process (with restricted syscall and filesystem access). If the image decoder is exploited, there's nothing the attacker can do.
That advice has context. Do not roll your own if the feature is not your core product offering. So don't roll crypto if you're not selling crypto. If it is your core offering (and media decoding is absolutely a core offering of a web browser), you should choose carefully whether to get it off the shelf or roll your own.
Otherwise how would new/better stuff ever get built?!
If Apple and Google can’t even find all the vulnerabilities in their libs, how on earth would a scrappy team of a few devs, especially since media decode isn’t the sole thing they’re focused on?
> Otherwise how would new/better stuff ever get built?!
The problem here is that people are salivating to use this as their daily driver. When WireGuard was still in development, everyone got told in very strong terms to not use it in any setting that required actual security.
Browsing the web at large is sort-of hostile by default.
Ladybird is a great project, and I hope it keeps developing, but any user who thinks their media decode libraries will be bulletproof, free of vulnerabilities, is nuts.
If Apple and Google can’t even find all the vulnerabilities in their libs, how on earth would a scrappy team of a few devs
Perhaps a few devs have nowhere near the required escape velocity to create vulnerabilities before they can be fixed, nor the pressure of PMs to ship substandard code?
> but any user who thinks their media decode libraries will be bulletproof, free of vulnerabilities, is nuts.
Sure. And it's a high bar to match or beat the vulnerability profile of the established players. But a "small scrappy team" that is capable of doing everything this team has done certainly inspires a lot of confidence that the bar is reachable.
Apple and Google are big corporations, and those are legendary for their inability to make anything properly. It has been a while since they were small and could move fast... So no, I would not take them as a standard.
That's fine, I'm sure they weren't targeting only you when they developed it. So it will still have utility for the developers of the project and other users.
Just because something is widely used doesn't mean it's more secure (example: libwebp). The security issues tend to happen mostly when creating optimizations that bypass the "obviously secure" way to do things, but rely on an internal state that gets broken by another optimization down the line. This is way less frequent in "smaller" projects, just because they didn't need to go through that round of optimizations yet.
For this question specifically, though, I think Ladybird is extremely interesting as a security-focused C++ project. Consider: the constant fuzzing that each part of Ladybird already undergoes (courtesy of the OSS-Fuzz initiative), the first-class citizenship of process separation for the different features of the browser, the modern C++ used in Ladybird (which prevents a lot of the issues cropping up in commonly used media decoding libraries), the overall focus on accuracy over performance in library implementations, and the compiler-level sanitization utilities (UBSan, ASan) that are enabled by default. Given all that, I think it's less likely that a critical security-impacting bug would exist in Ladybird's .ico support than in WebKit's, for example.
If massive companies like Google and Apple can’t even find all the vulnerabilities, how are you expecting a scrappy team to?
Don’t get me wrong, how far they’ve gotten is very laudable, and as an educational exercise it is really cool, but it starts being a pretty massive risk if users start using this as a daily driver.
Google and Apple are just a bunch of scrappy teams trying to work together on insanely massive and bloated code bases.
Numbers of bugs scale with lines of code.
A small scrappy team writing simple and concise code from scratch is likely to produce fewer bugs than enterprisey monstrosities.
>He did not even use the C++ standard library, when he says "from scratch" it includes his own string class, for better or worse
I can't imagine how it can be for worse; the standard C++ string library is awful. It makes perfect sense that a super-talented C++ dev would make something better if they have the energy and time.
As far as I can tell Jakt's reference counting is not optional. So it may be closer to Swift.
That being said I've seen a few people here suggest it's easier to use rust's Rc and Box for everything and treat it like Haskell or Scala. So it might not be so different in practice.
He was probably their top guy too. Sort of like how Lucifer was the top angel in Heaven before his drug problems brought about his fall from Apple, and since then he's been a true light-bringer, working tirelessly to give us awesome software as he recovers from addiction.
Compared to a lot of other new browser engines, this one actually renders a lot of web content decently. And if you follow their update videos, they improve their coverage really quickly.
Ladybird also comes with their LibJS runtime, which has good coverage of the JS standards and even manages to implement some new features before the big browsers all get to it.
When you open sites on other "new" browser engines you typically get a really butchered visual result, with layouts completely broken, elements missing, wrong colors, etc. For example, Servo didn't support floats until recently, and IIRC even simple sites like Hacker News look "wrong".
Ladybird's approach has been to start with a somewhat naive implementation of features, then choose popular websites and apps and just continuously iterate to make them gradually look better, by fixing the parts that stand out. This pragmatic approach means that their supported feature set, while nowhere near 100%, can decently render 90% of websites due to being aligned with the most commonly used features.
I can't relate to / don't recognise these claims of incorrect rendering; is there a resource out there that shows images of how it's supposed to look vs. what it actually looks like? I thought this was a problem of the past, an IE-compatibility-with-web-standards kind of thing.
Which browser are you talking about that renders everything correctly? Are you using a Servo-based browser? Is there even a Servo-based browser that someone can easily download and use?
Servo themselves say they only pass 55.8% of tests[1]. This thread[2] says Servo didn't support SVG as of Nov 2022.
> I thought this was a problem of the past, IE compatibility with web standards kind of thing.
For the mainstream browser engines, yes, but if you're starting a browser from scratch the amount of stuff you have to implement is massive and cannot be implemented in the span of even a couple of years.