From the blog at https://servo.org/blog/2025/10/20/servo-0.0.1-release/
> Today, the Servo team has released new versions of the servoshell binaries for all our supported platforms, tagged v0.0.1. These binaries are essentially the same nightly builds that were already available from the download page with additional manual testing, now tagging them explicitly as releases for future reference.
> We plan to publish such a tagged release every month. For now, we are adopting a simple release process where we will use a recent nightly build and perform additional manual testing to identify issues and regressions before tagging and publishing the binaries.
> There are currently no plans to publish these releases on crates.io or platform-specific app stores. The goal is just to publish tagged releases on GitHub.
Is it as simple as "now is as good a time as any to start tagging releases"? There's no special motivating factor that drove this to happen now?
I think it's also that they finally got Mac/Arm releases sorted, so now they have the full platform support matrix for nightlies?
That's roughly correct. The other side of this is figuring out a release process and thinking about versioning.
The release announcement doesn't contain much information, but Servo does publish regular "This month in Servo" updates on their blog which contain lots of details:
- Blog: https://servo.org/blog/
- Most recent TMIS post https://servo.org/blog/2025/09/25/this-month-in-servo/
Check them out if you're interested in what's going on with Servo.
When Google Reader died, so did a large part of me, and the web.
That said, I'm recently back on RSS and this is another good feed:
https://servo.org/blog/feed.xml
I wish I had an RSS reader to feed this to...
One of my favorite RSS readers is https://vore.website - river of news, no unread indicators, simple. It's a website, as the domain suggests, so no need to install anything.
Tried it out on Linux. Worked better than I expected. Sites that are text heavy render well, and quickly. Sites with more "customization" sometimes struggled with rendering; stuff all over the place. Memory usage seemed a bit higher than Firefox with the same tabs, but not out of this world higher.
All in all, an impressive release.
It’s still a ways off, but I’m excited for the possibility of something like Tauri using Servo natively instead of needing host browsers. A pure Rust desktop app stack with only a single browser to target sounds fantastic.
But then we have the same complaint against Electron, namely large deployment sizes and no shared memory, no?
this part is important:
> A pure Rust desktop app stack
I think the parent is imagining a desktop with Servo available as a standard lib, in which case you're left with the same complaint as Tauri (not Electron): that the system version of Servo might be out of date.
Whether it's something like this, or ladybird's engine, I'm happy there is work being made in this space.
+1
Personally I'm more optimistic about Servo - because originating at Mozilla, I imagine more web browser experience and expertise went into its architecture, and also because Rust.
> originating at Mozilla, I imagine more web browser experience and expertise went into its architecture
Andreas Kling, who created Ladybird, had prior experience working on KHTML/WebKit, so there is expertise there too.
I don't know... Servo has been in development for a decade and still has quite underwhelming performance and UX. The binary is 100 MB+ on Mac, scrolling is janky, and a Google image search takes 10+ seconds to render and goes through very buggy states. Meanwhile Ladybird renders a legacy UI, but feels really fast and stable.
> Servo has been in development for a decade
I was curious how you arrived at that figure so I checked the dates. Servo began in 2012 as a Mozilla skunkworks project, died off in 2020, and was revived in late 2023. If you simply subtract the "dead" period, sure, it doesn't look like it was going anywhere fast, but that's ignoring the multiple major changes in direction and the 5+ years during which Servo development was fully subordinate to Firefox development. It only became a fully independent browser development effort after the project was revived by Igalia.
“Servo is more than a browser engine—it’s a collection of crates used widely across the Rust ecosystem. Maintaining these libraries benefits not just Servo, but the broader web platform.”
Per: https://www.igalia.com/2025/10/09/Igalia,-Servo,-and-the-Sov...
> binary is 100MB+ on Mac
If you're worrying about that size, then macOS is not the platform for you.
Seeing Servo and full-fat Electron [1] both at 100 MB made me wonder if that's the minimum for an "Everything bagel" browser engine that does WebRTC, video playback, etc., etc.
How big is Ladybird?
[1] I believe you can make Electron smaller by cutting parts of Chromium out, but the default is around 100 MB
There are ways to slim it down, but WebRTC and video playback would probably be one of the first things I'd remove if I were looking to do that!
The other obvious target is the JS engine. IIRC V8 is 90 MB just by itself. I don't think SpiderMonkey is quite so large, but it's definitely still in the tens of megabytes. A slower, simpler JS engine (QuickJS, Hermes, etc.) could be quite a bit smaller.
That, however, would limit the browser to a small audience. Many users won't accept videos not playing, and many sites require a JavaScript engine with all those optimisations; even SpiderMonkey loses too much in that space.
Binary size however is less of an issue for most users.
Yeah, I think these kinds of setups don't make much sense for the main "browser application that end-users use" use case. They can make a lot of sense in the Electron-style "I'm wrapping a browser to use as an app framework" use case, though.
Meanwhile Lua is under 200 KB. Imagine if you could use it as a browser language: no more bloat and churn.
Is some kind of a browser microkernel possible? Could you ship, say, JS Canvas support in a separate optional module?
A separate module that is configurable at build time would probably be doable. A separate module that is loaded at runtime probably isn't feasible.
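To make the build-time option concrete, here is a minimal sketch of how a Rust codebase typically gates an optional module behind a Cargo feature (a generic illustration with a hypothetical `canvas` feature, not Servo's actual feature names):

```rust
// Hypothetical `canvas` feature, declared in Cargo.toml as:
//   [features]
//   default = ["canvas"]
//   canvas = []
//
// Everything behind the cfg is compiled out when the feature is disabled,
// so a build made with `--no-default-features` simply doesn't contain it.
#[cfg(feature = "canvas")]
mod canvas {
    pub fn fill_rect(x: f32, y: f32, w: f32, h: f32) {
        // Real rasterization code would live here.
        println!("filling rect at ({x}, {y}) with size {w}x{h}");
    }
}

fn main() {
    #[cfg(feature = "canvas")]
    canvas::fill_rect(0.0, 0.0, 100.0, 50.0);

    #[cfg(not(feature = "canvas"))]
    eprintln!("this build was made without canvas support");
}
```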
I’m more hopeful about Servo because it’s released under a copyleft licence, whereas Ladybird chose a pushover one.
Can you elaborate what you mean by pushover license?
Ladybird uses the BSD-2-Clause license, which is OSI-approved. I mean, it's not FSF/copyleft, but it's permissive, which should sometimes be better for things like embedding, etc., no?
It looks like Servo uses the Mozilla Public License 2.0. Can you please explain the difference and why you think one is a pushover license and the other is not?
I think Ladybird will beat Servo at making a usable and good product. Mozilla might have more resources, but that's not the only thing you need if you want to build great software.
> Mozilla might have more resources but that's not the only thing that you need if you want to build great software.
Servo is no longer a Mozilla project, and hasn't been since 2020. It's now developed by Igalia, Huawei, and a collection of volunteers.
Servo's value is that it's written in Rust.
Ladybird is C++ and that still has the same issues as every other engine.
I suspect Ladybird will/has already leapfrogged Servo in performance and usage due to the Ladybird team and its momentum. Mozilla isn't doing anything with Servo anymore.
But I also don't really see a compelling reason for Ladybird's existence - we already have Chromium, Blink, Gecko, etc. It's hard for me to imagine a world where Ladybird is a healthy contender for marketshare.
The only real novel thing to do in this space is "rewrite it in Rust".
> The only real novel thing to do in this space is "rewrite it in Rust".
Ironically Chromium is now starting to include quite a bit of Rust. And of course Firefox has for some time.
They are planning to use Swift in the future. Last point: https://ladybird.org/#faq
Aren't Chromium and Blink basically the same thing? And Gecko isn't embeddable.
Agreed. Servo is emphatically not anything resembling a priority at Mozilla and hasn't been for a long while.
Mozilla gave up on it a while ago.
It somehow survived after years with little progress and has relatively recently gathered speed again under new stewardship.
I am sooo ready to ditch the Chrome and Firefox duopoly
We are lucky it's even a duopoly. All it would take is the demise of Firefox, and the entire web would be defined entirely by the implementation of Chrome/Chromium.
Servo is very welcome; a third leg to the stool makes real diversity possible again.
Don't forget that pretty much 100% of iOS users and a nontrivial percentage of Mac users are on WebKit/Safari. That's not to say Safari is really leading the pack on anything at all, but Google also hasn't led Apple by the nose on pretty much anything on the web in recent years.
Yup, the split is really Blink+WebKit. Gecko marketshare is tiny these days.
What's interesting is seeing a few non-Apple WebKit browsers pop up, like Orion (Kagi) and Epiphany.
Call me cynical, but I don't see Ladybird or Servo do much beyond making a splash. Browser engines take an incredible amount of dev hours to maintain. Ladybird is hot now, but what about in a decade? Hype doesn't last that long and at that point the money and a chunk of the dev interest will have dried up.
Blink and WebKit both have massive corporations championing them, so those engines do not run that risk.
Ladybird seems to be progressing at an impressive pace as well. Time will tell, however, whether their choice of C++ will be a big problem or whether modern ways of doing things are safe enough.
Their choice is actually Swift and by the time there's a stable release all the C++ code is intended to have been replaced.
Time will tell if that will be a big problem or if more mainstream ways of doing things are better for a project intended to run everywhere!
I remember them mentioning Swift a few months ago, but currently I don't see any Swift in their GitHub repo; I didn't check other branches besides main.
> all the C++ code is intended to have been replaced.
That is not their goal at all; I don't know where you heard that. Swift is currently stalled due to some blockers listed on their issue tracker, but any usage of it will be in safety-critical areas first, not a complete rewrite of existing code.
Very excited for Ladybird and Servo. I wonder if a good thing that may emerge from this era of LLM code-support capabilities is that it's more feasible to support alternative browser codebases even as they get into the multiple millions of lines of code.
They chose C++ because the web spec implies object-oriented design.
No they didn't. It's C++ because the primary author was most familiar with C++ and only allowed C++ in SerenityOS.
https://ladybird.org/#:~:text=The%20choice%20of%20language%2...
That was the answer I remember Andreas giving in an update video in answer to the "why not Rust" question.
That doesn't really make sense to me either. Even if WebIDL is inheritance based, that is going to be processed automatically so you can easily use codegen to make the resulting interface nice in Rust, in a way that would be relatively difficult if you were hand-writing it.
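To illustrate the point, here is roughly what generated bindings can do to model an inheritance-based WebIDL hierarchy in Rust, which has no class inheritance (a hand-written sketch of the general pattern, not Servo's actual codegen output): embed the parent struct as the first field and generate upcast/Deref glue.

```rust
use std::ops::Deref;

// WebIDL: `interface Node { ... }` and `interface Element : Node { ... }`.
// Codegen can model the inheritance by embedding the parent as a field...
pub struct Node {
    node_name: String,
}

impl Node {
    pub fn node_name(&self) -> &str {
        &self.node_name
    }
}

pub struct Element {
    node: Node, // "base class" data
    tag_name: String,
}

impl Element {
    pub fn tag_name(&self) -> &str {
        &self.tag_name
    }
}

// ...and generating Deref (or explicit upcast methods) so the parent
// interface's methods are usable directly on the child, much like
// inherited methods would be in C++.
impl Deref for Element {
    type Target = Node;
    fn deref(&self) -> &Node {
        &self.node
    }
}

fn main() {
    let el = Element {
        node: Node { node_name: "DIV".into() },
        tag_name: "div".into(),
    };
    // node_name() comes from Node, tag_name() from Element.
    println!("{} / {}", el.node_name(), el.tag_name());
}
```

Writing that boilerplate by hand for hundreds of DOM interfaces would be painful, but a code generator emits it mechanically from the WebIDL.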
They've announced they want to move to Swift to combat some of this.
Yep, but there was another post mentioning half a million lines of C++ code so far.
While the C++ interop in Swift seems sane, with Clang being embedded, I wonder how much time/energy they will have to actually move significant parts if it's already so large.
Firefox market share is so low, it really seems more like a Chrome and Safari duopoly.
It's all Konqueror's fault, really.
I've seen a lot of criticism of Mozilla in these parts, some more fair than others. (Adtech = bad, regardless of whether you call it privacy preserving. CEO pay, not as bad as people say but don't love it.) But the notion that a trillion dollar platform company dictating web standards and Firefox are two sides of the same coin is, by my lights, the singularly most spectacular failure of comprehension that's been wrought by this era of Mozilla skepticism. It's not exactly a big lie because the people saying it seem to sincerely believe it but it's comparably disastrous as a test of information literacy.
Mozilla was sitting on a chest of cash that could have funded engineering efforts for decades. Instead they decided to inflate the ranks of managers and marketers in an effort to expand market/mindshare, and to follow that with ever-increasing funding drives to pay for lavish parties and events on the marketing side, while shuttering engineering efforts and even laying off swaths of engineering talent.
That doesn't even touch some of the more salient political movements or failure after failure to spin the brands off into something more/different for profit motives.
Mozilla needs to restructure as an engineering-focused organization where business operations, marketing, and brand management are not steering the ship.
Are non-profits in the US allowed to hoard cash long-term?
In the UK, spending on furthering their charitable purpose is expected to roughly match income over the medium term. There are carve-outs for specific types of "permanent endowment" (and even there, spending is meant to match the investment income) but it wouldn't cover anything like Mozilla's commercial agreement with Google.
Mozilla has already hoarded well over a billion. A billion would pay a sizable development team of experts for quite a while.
https://assets.mozilla.net/annualreport/2024/mozilla-fdn-202...
But the Mozilla Foundation's purpose is "protect and improve the Internet as a public resource, open and accessible to all".
It's not clear to me why that requires a sizeable team of developers - surely they'd be better off working for MoCo (the commercial subsidiary who make the browser and who provide a large portion of the MoFo's income)?
MoFo's activities are centred on philanthropy and advocacy. You'd expect most of their staff to be experts in things like community engagement, policy research and development, grant-making, campaign strategy, volunteer welfare, reporting & transparency, and management of investments.
Sure, there'll be some engineering needed to support that, but it shouldn't be their core focus.
You're arguing the stated purpose of the current system; I'm arguing we'd actually benefit more from refocusing it on software (like it used to be).
And that's the stated purpose. The observed current purpose of the system is to make a small handful of people more rich.
The MoCo/MoFo split happened for a reason: a non-profit couldn't do the big commercial deals that became available to MoCo.
If you went back to the pre-2005 situation, in which MoFo was all there was, it would have at most low single-digit millions in the bank rather than a billion. The AOL dowry was only intended to last a couple of years, and there's simply no way it could have sustained development of the browser beyond that. The Phoenix would have been consumed by the flames, and we'd be left with a stagnant IE/Chrome duopoly.
Worth noting that Mozilla Corporation (which I believe is the entity that has the contract with Google) is a for-profit organisation wholly owned by Mozilla Foundation, which is the non-profit.
In theory, it feels like that ought not to change anything regarding the legal situation, but I bet it does.
Firefox isn't a part of any duopoly, with market share numbers as low as they are these days. Chrome + Safari, perhaps? (Or Chrome + Edge if you exclude mobile, though Edge of course uses the same rendering engine as Chrome.)
The duopoly is Chrome and Safari. Firefox barely registers, especially because all browsers on iOS are Safari.
Also, what's your issue with Firefox?
A few hours ago, just a few comments: https://news.ycombinator.com/item?id=45642051
If you email the mods they’ll merge the duplicate discussions. Footer contact link.
I wonder if it is a deliberate choice not to include a scrollbar? Is it due to limitations of the UI widgets, or are scrollbars nowadays considered part of the website, since some websites are very happy to set the scrollbar size to "too narrow for comfortable use" or even remove it altogether? To end on a positive note: is there a way for an average developer to try and fix this issue, thus doing my own share of contributing? Where should one start?
Related: https://github.com/servo/servo/issues/21817
You should likely join https://servo.zulipchat.com and ask questions to know where to start.
Congrats to the servo team. It's been a long road and it's amazing they kept it alive.
I'd like to see this succeed, but I'm skeptical that a small team can keep up with the major players in this area. Many years ago Dan Kennedy (of the SQLite team) wrote a lovely HTML widget for TCL/TK. It rendered CSS 1.0 quite nicely, and was a pleasure to use, modulo a few font-related bugs; but was soon rendered obsolete and out of date. Not blaming Dan, here; it simply wasn't a one-person job. Meanwhile, I'd rewritten an app to make use of it. Got burned once, don't want to get burned again.
I feel like part of the solution here is to build the browser as reusable modular components. For some parts of browsers that's been common for years: JS engines (V8, SpiderMonkey, etc) are typically reusable, as are rendering backends (WebRender, Skia, etc), and lower-level components like Freetype/Harfbuzz/icu.
Servo's CSS engine Stylo is also modular, and is shared with Firefox, which is part of how Servo has managed to not completely fall behind in web standards support despite the project being all but abandoned for several years.
I'm building another browser engine, Blitz [0], which also uses Stylo, and we're building our layout/text engine in such a way that it can be reused, so future browser engines (at least ones written in Rust) shouldn't need to build either Style or Layout if they don't want to.
A few more infrastructure pieces like this and browser engine development starts to look more approachable.
[0]: https://github.com/DioxusLabs/blitz
Thanks for your hard work. I've already seen Taffy being used by other prominent projects like the COSMIC desktop environment, Bevy, etc.
It's several small teams. Servo is modular, and parts of it are useful outside of Servo. Other projects are using and maintaining and enhancing those modules. For example, IIRC dioxus uses many of the modules.
Edit: see sister comment by the actual Dioxus guy, which is more accurate than mine!
I seem to recall that MMM was based on this widget.
For context, MMM was a browser that supported both browser addons and sandboxed applets, around 1995.
I am confused, I remember downloading and trying an early Servo release out a very long time (decade?) ago.
I've not been following the space, is this a different project with the same name?
If the other project was a web browser then it's the same project. It got abandoned ~5 years ago, but has since been picked up again.
Same, reborn
I hope they give it a new name with the rebirth. I know it means something to some people but there are a lot of different things with that name
I hope they don't, Servo is a technology
If someone wants to put marketing veneer on top of a new project that uses servo, great! But servo is servo: a rendering engine
I'm so going to try this, and I hope it will end up as when I tried and used Phoenix, and then Firebird.
Is Servo ready if I want to play around with it in an embedded-browser capacity? Say I wanted to have some basic HTML+CSS UI, can I create a Rust binary that embeds Servo+those resources and it kind of works?
If you don't need JavaScript, then you might be interested in https://github.com/DioxusLabs/blitz.
It pulls in Servo/Firefox's CSS engine Stylo (and Servo's HTML parser html5ever) and pairs it with our own layout engine (which we are implementing mostly as libraries: Taffy [0] for box-level layout and Parley [1] for text/inline layout) and DOM implementation. Rendering and networking are abstracted behind traits (with default implementations available) and you can drive it using your own event loop.
Minimal binary sizes are around 5 MB (although a more typical build would be more like 10-15 MB).
[0]: https://github.com/DioxusLabs/taffy [1]: https://github.com/linebender/parley
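For a sense of what Taffy looks like when driven directly, here is a minimal sketch (written against the recent `TaffyTree` high-level API; exact names may differ between versions):

```rust
use taffy::prelude::*;

fn main() {
    let mut tree: TaffyTree<()> = TaffyTree::new();

    // Two fixed-size children inside a flex-row root.
    let child_a = tree
        .new_leaf(Style {
            size: Size { width: length(100.0), height: length(50.0) },
            ..Default::default()
        })
        .unwrap();
    let child_b = tree
        .new_leaf(Style {
            size: Size { width: length(100.0), height: length(50.0) },
            ..Default::default()
        })
        .unwrap();
    let root = tree
        .new_with_children(
            Style {
                display: Display::Flex,
                size: Size { width: length(400.0), height: length(100.0) },
                ..Default::default()
            },
            &[child_a, child_b],
        )
        .unwrap();

    // Resolve the layout, then read positions/sizes back for rendering or hit-testing.
    tree.compute_layout(root, Size::MAX_CONTENT).unwrap();
    let b = tree.layout(child_b).unwrap();
    println!("child_b at {:?}, size {:?}", b.location, b.size);
}
```

A browser engine like Blitz feeds styles resolved by Stylo into a tree like this rather than constructing `Style` values by hand, but the layout pass itself is the same call.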
Would this be a good fit for rendering a game UI? Showing various stat/dialogue displays, an inventory/equip screen with draggable items, menus, etc. All I really want is HTML+CSS to do styling and layout, and I'd rather have the interaction logic in the game code than in JavaScript anyway.
I think it would, modulo that it's not really "ready" yet.
We do have a couple of PoC examples of integrating with the Bevy game engine. Both of these use Dioxus Native, which wraps Blitz with Dioxus (which is a React-like framework but in Rust rather than JavaScript - https://github.com/DioxusLabs/dioxus), but you could do DOM tree creation and event handling manually if you wanted to.
- This first one includes Bevy inside a window set up by Dioxus Native (using a `<canvas>` element similar to how you might on the web). Here the event loop is controlled by Dioxus Native and the Bevy game is rendered to a texture which is then included in Blitz's scene. https://github.com/DioxusLabs/dioxus/tree/main/examples/10-i...
- This second one does it the other way around and embeds a Dioxus Native document inside a window set up by Bevy. Here the event loop is controlled by Bevy and the Blitz document is rendered to a texture with which Bevy can then do whatever it likes (generally you might just render it on top of the game, but someone tried mapping it into 3d space https://github.com/rectalogic/bevy_blitz) https://github.com/DioxusLabs/dioxus/tree/main/examples/10-i...
The latter is probably what I would recommend for game UI.
Both approaches probably need more work (and Blitz could do with more complete event handling support) before I would consider them "production ready".
Igalia (who are heading Servo nowadays) say:
> Embedding Servo into applications requires a stable and complete WebView API. While early work exists, it’s not yet ready for general use.
(While announcing that they got funded to fix that.)
https://www.igalia.com/2025/10/09/Igalia,-Servo,-and-the-Sov...
You would end up simply with Electron 2.0. I tried de-entangling the Servo CSS / JS / layout engine some years ago, to see if it would be more lightweight; it wasn't: https://github.com/fschutt/servo_gui_test (62 MB binary size, several hundred MB RAM usage IIRC)
I am currently working on getting https://azul.rs/reftest ready, which uses some of the same underlying technologies as Servo (taffy-layout, webrender) but uses no JavaScript and also has a C / Python API. Azul is basically that, except it's not usable yet.
See my comment (https://news.ycombinator.com/item?id=45644277) about Blitz. Perhaps you might be interested in collaborating :)
Also, we're not using it in Blitz (although it could be added as a backend) but a note that WebRender is maintained. See Servo's most recent 0.68 branch (https://github.com/servo/webrender/tree/0.68) and also ongoing upstream development in the Firefox repository https://github.com/mozilla-firefox/firefox/tree/main/gfx/wr
I know about Dioxus / Blitz, but it's a very, very different project. The only common part is that both Azul and Blitz use Taffy for flexbox / grid, but the technologies, architecture, funding, and goals are extremely different:
Blitz:
- Custom renderer (Skia?) vs Azul's WebRender fork (to get rid of any C dependencies)
- Stylo (CSS parser) vs azul-css (to support compilation of CSS to const items)
- HarfRust (font shaping) vs allsorts (I used allsorts also in printpdf, so it fits)
- Skrifa (font parsing) vs allsorts again (simplifies things)
- Fontique (font selection) vs rust-fontconfig (custom pure-Rust rewrite of fontconfig)
- Parley (line breaking) vs Azul's text3 engine
- All as separate projects vs Azul's monorepo style
Dioxus:
- RSX macros, data + functions coupled together vs Azul's "C function callbacks + HTML dataset" model
- Binary hot-patching vs Azul's dynamic linking model
- Macros vs Azul's HTML/CSS-to-Rust/C compiler build tool (no macros)
- Funded by YC (not sure about the upsell?) vs funded by donations (once it's stable enough) and my Maps4Print cartography startup (dogfooding)
These things matter, even for small decisions. For example, Azul uses a custom CSS parser because the CSSProperty is a C-compatible enum, so that later on you can compile your entire CSS to a const fn and use CSS strings without even doing any allocations. So even on that level, there's a technological-architectural difference between Azul and Stylo.
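As a rough illustration of what "CSS as a C-compatible const data structure" means (invented types for illustration, not Azul's actual API):

```rust
// Invented illustrative types, not Azul's real ones.
#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub enum Display { Block, Flex }

#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub struct Px(pub f32);

#[repr(C)]
#[derive(Clone, Copy, Debug)]
pub enum CssProperty {
    Display(Display),
    Width(Px),
    Height(Px),
}

// Because every variant is Copy and #[repr(C)], a build tool can translate
// a .css file into a const item like this: nothing is parsed or allocated
// at runtime, and the same data can be handed across a C FFI boundary.
pub const BUTTON_STYLE: &[CssProperty] = &[
    CssProperty::Display(Display::Flex),
    CssProperty::Width(Px(120.0)),
    CssProperty::Height(Px(32.0)),
];

fn main() {
    for prop in BUTTON_STYLE {
        println!("{prop:?}");
    }
}
```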
But the core point is more architectural: Azul's architecture is built for de-coupling the user data from the function callbacks, because I see this as the Achilles' heel that all GUI systems so far have failed at:
https://github.com/fschutt/azul/blob/master/doc/guide/02_App...
Dioxus, however, repeats this exact same pattern again, and even the Elm architecture doesn't really fix it. I didn't finish the document, but basically there is (1) a "hierarchy of DOM elements" and (2) a "graph of UI data", and those two are not always the same - they can overlap, but the core assumption of many GUI toolkits is that (2) is a tree (it's a graph, really) and that (2) is always in the same hierarchy as (1), which is why GUI programming is a pain, no matter what language / framework. Electron just makes the visual part easier, but then you still need React to deal with the pain of data model / view sync.
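To make the tree-vs-graph point concrete (a generic Rust illustration, not tied to Dioxus, Azul, or Elm): the same piece of application state often needs to be referenced from widgets in completely different parts of the DOM hierarchy, which is exactly where the tree assumption breaks down.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// One piece of application state...
struct Volume(u8);

// ...referenced by two widgets that live in completely different DOM subtrees
// (a slider in a settings panel and a read-only indicator in the toolbar).
struct Slider { value: Rc<RefCell<Volume>> }
struct ToolbarIndicator { value: Rc<RefCell<Volume>> }

fn main() {
    let volume = Rc::new(RefCell::new(Volume(30)));
    let slider = Slider { value: Rc::clone(&volume) };
    let indicator = ToolbarIndicator { value: Rc::clone(&volume) };

    // Mutating through one widget is immediately visible through the other:
    // the data model is a graph with shared nodes, not a tree mirroring the DOM.
    slider.value.borrow_mut().0 = 80;
    println!("toolbar shows volume = {}", indicator.value.borrow().0);
}
```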
I can collaborate on the flex / grid solver, of course, but it's very hard to collaborate on anything else because the technologies used, the goals, the architecture, etc. are very different between Dioxus / Azul. Azul is more of a "monorepo-NIH integrated solution" (because I often got bug reports in 2019 that I couldn't fix because I didn't own the underlying crate, so I had to wait for the maintainers to do another release, etc. - I learned from that).
As a note, the layout engine is also now heavily vibe-coded (sorry not sorry), so I don't take credit - but feel free to take inspiration or copy code. Gemini says the solver3 code is a "textbook implementation"; take that as you will. My idea was to build an "AI feedback loop" to semi-automatically put the HTML input, the debug messages (to see which code paths are hit), the source code, and the final display list into a loop to let the AI auto-debug the layout engine. So that part of writing the HTML engine isn't really hard, assuming the plan works out. The hardest parts are caching, scrolling, performance debugging, interactions between different systems, and especially supporting the C API. Layout is comparably simple.
It's worth noting that:
- You don't have to use Dioxus to use Blitz: you can do your own DOM construction and event handling with imperative Rust APIs.
- You don't have to use any of the provided renderers to use blitz-dom (although our default renderer is Vello, which is also pure Rust), and it would be possible to hook it up to WebRender.
- We have a lot of the tricky incremental layout and caching logic implemented (although there are still bugs).
- Blitz has grant funding through NLnet as well as funding from DioxusLabs, and is fully open source under permissive licenses (MIT/Apache 2.0) that don't really allow for "rug pulling".
---
That being said, the designs around CSS do sound quite different: we have chosen to take on a relatively heavy dependency in Stylo; we don't support non-tree-like structures; and in general, if you wish to do your own thing then that is what you ought to do!
Not sure that I agree that layout is simple (I have spent many long hours debugging the nuances of CSS layout over the past months), and I'm a little skeptical that an AI-based approach will work out. But I wish you luck!
I tried it as a little preview window for writing my blog, which is (in my opinion) very basic HTML and CSS. Whole page rendered wrong, though I admit I didn't bother to find out why. Give it a shot, but keep your expectations low.
I tried my simple HTML/CSS website and it kinda worked, actually. Even dark mode/light mode worked, but it was also a minimalist, pure HTML/CSS website.
If you have a basic site that doesn't work, you can open an issue on the repo. If you have some relatively simple site, it's useful for the team to know which features that people actually use are broken.
Link? I'm a Servo maintainer and I appreciate test cases like that.
I'm seriously impressed by how far this has come. Tried a few websites in the experimental mode, and it renders quite well.
Mozilla/5.0 (Android; Mobile; rv:128.0) Servo/0.0.1 Firefox/128.0
Does it support kiosk mode or is it configurable to run “locked down” to a single page and full-screen?
This is an incomplete browser engine, suitable mostly for technical contributors. If you're looking for a solution for kiosks, there are good for-purpose products/projects. Examples include: OpenKiosk, Porteus Kiosk, SiteKiosk.
If servoshell doesn't, Tauri will; the Tauri project seemed open to collaborating with Servo as an alternative to OS-provided WebViews.
OK, my understanding is that Servo is a browser.
Then I read this on their repo:
>Servo aims to empower developers with a lightweight, high-performance alternative for embedding web technologies in applications.
Um... what? Are they just saying it's a browser in a verbose way, or what? It just seems like you could replace literally all those words with "browser" and the clarity would skyrocket. Although perhaps it's not actually just a browser and I don't understand.
To many people involved in browser development there is a distinction between the "browser" (Chrome and Firefox, but also Opera, Brave, and Arc, which don't develop their own engine) and the "web engine" (Blink, WebKit, Gecko).
Servo is currently more of the latter than the former, as its UI is a pretty minimal one that is mostly useful for testing and doesn't have many of the niceties that users expect of a modern browser (bookmarks, history, password manager, etc.).
I do agree that it's confusing for most people though.
OK, that is a fair distinction I guess. A browser engine would be more clear then, I think. That is what it says in the readme.
Servo is to a browser what Chromium Embedded Framework is to Chromium. It is the vast majority of what is necessary for a browser, but it is not a browser in itself: it renders websites, but all of the user-facing browser functionality around that is a separate concern.
> Servo is a prototype web browser engine
Yes, those words are also in the repo.
Ah nice, they’re finally generating native ARM Mac binaries.
They just issued their first release, 0.0.1, after 50,000 commits. I've never seen that before.
Version numbers don't really mean much, especially for a project that was initially supposed to just be a proving ground for new Firefox technologies, some of which are indeed used in Firefox today.
Only more recently has the plan emerged to release a full browser engine based on Servo.
It would be a pleasure to check out the open source web engine you have been a major contributor to :)
Is there a remind-me bot for when a relevant version number releases? Like 1.0, for example.
That might be a while. It's taken 5 years from being transferred to the Linux Foundation to get to 0.0.1.
All the more reason for asking the question?
Adding context on a tangent
"The Missing Protocol: Let Me Know" https://news.ycombinator.com/item?id=44881287
Such a thing could be implemented with RSS on a long timescale or ntfy.sh on a short one, but AFAIK most projects don't.