Behind the scenes of Bun Install (bun.com)
Submitted by Bogdanp 18 hours ago
  • captn3m0 12 hours ago

    > The M4 Max MacBook I'm using to write this would've ranked among the 50 fastest supercomputers on Earth in 2009.

    I attempted to validate this: you'd need >75 TFlop/s to get into the top 50 of the TOP500[0] rankings in 2009. An M4 Max review says 18.4 TFlop/s at FP32, but TOP500 uses LINPACK, which runs at FP64 precision.

    An M2 benchmark gives a 1:4 ratio for double precision, so you'd get maybe 9 TFlop/s at FP64? That wouldn't make it to TOP500 in 2009.

    [0]: https://top500.org/lists/top500/list/2009/06/
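
    The back-of-the-envelope math is easy to check; the FP64:FP32 throughput ratio is the shaky assumption, so this sketch tries a range (all figures are the ones quoted in this comment):

```javascript
// Rough check of the supercomputer claim, using the figures quoted above.
const fp32TFlops = 18.4;   // M4 Max FP32 throughput, per the cited review
const top50Cutoff = 75;    // approx. TFlop/s needed for TOP500 top-50, June 2009

// The FP64:FP32 ratio on Apple GPUs is the big unknown; try 1:2 and 1:4.
for (const ratio of [2, 4]) {
  const fp64TFlops = fp32TFlops / ratio;
  const verdict = fp64TFlops >= top50Cutoff ? "makes top 50" : "misses top 50";
  console.log(`1:${ratio} -> ${fp64TFlops.toFixed(1)} TFlop/s, ${verdict}`);
}
// Either ratio leaves the estimate far below the 2009 top-50 cutoff.
```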

    • nine_k 11 hours ago

      > Now multiply that by thousands of concurrent connections each doing multiple I/O operations. Servers spent ~95% of their time waiting for I/O operations.

      Well, no. The particular thread of execution might have been spending 95% of time waiting for I/O, but a server (the machine serving the thousands of connections) would easily run at 70%-80% of CPU utilization (because above that, tail latency starts to suffer badly). If your server had 5% CPU utilization under full load, you were not running enough parallel processes, or did not install enough RAM to do so.

      Well, it's a technicality, but the post is devoted to technicalities, and such small blunders erode trust in the rest of the post. (I'm saying this as a fan of Bun.)

      • fleebee 7 hours ago

        I'm guessing that's an LLM hallucination. The conclusion section especially has some hints it was pulled out of an LLM:

        > The package managers we benchmarked weren't built wrong, they were solutions designed for the constraints of their time.

        > Bun's approach wasn't revolutionary, it was just willing to look at what actually slows things down today.

        > Installing packages 25x faster isn't "magic": it's what happens when tools are built for the hardware we actually have.

    • robinhood 14 hours ago

      Complex subject, beautifully simple to read. Congrats to the author.

      Also: I love that super passionate people still exist, and are willing to challenge the status quo by attacking really hard things - things I don't have the brain to even think about. It's not normal that we have better computers each month and slower software. If only everyone (myself included) were better at writing more efficient code.

      • ljm 12 hours ago

        I didn’t know it was written in Zig. That’s a fascinating choice to me given how young the language is.

        Amazing to see it being used in a practical way in production.

        • robinhood 11 hours ago

          Zig was created in 2016 though - almost 10 years at this point. Perhaps the surprise here is that we are not as exposed to this language on well-known and established projects as other languages like Rust, Go and C.

          • pdpi 10 hours ago

            Zig is still at the 0.x stage, and there's still a bunch of churn going on around really basic stuff like IO and memory allocation. I really enjoy writing it, but it's by no means stable enough that many people would write production software in it.

            • dwattttt 6 hours ago

              Rust hit 1.0 in 2015, it started as a project by Graydon Hoare in 2006; those dates line up pretty well with Zig's timeline.

              • ivanjermakov 6 hours ago

                To be fair, Zig 10 years ago was a drastically different language from Zig today.

              • epolanski 10 hours ago

                The language is very much in development, but its ecosystem and tooling are absolutely mature.

                • ivanjermakov 6 hours ago

                  I would not say the ecosystem is mature, outside of superb C interop and popular C/C++ lib wrappers.

            • blizdiddy 17 hours ago

              I used bun for the first time last week. It was awesome! The built-in server and SQLite meant i didn’t need any dependencies besides bun itself, which is certainly my favorite way to develop.

              I do almost all of my development in vanilla js despite loathing the node ecosystem, so i really should have checked it out sooner.

              • k__ 17 hours ago

                I tried using Bun a few times, and I really liked working with it.

                Much better than Node.

                However...!

                I always managed to hit a road block with Bun and had to go back to Node.

                First it was the crypto module that wasn't compatible with Node.js signatures (now fixed); next, Playwright refused to work with Bun (via Crawlee).

                • Jarred 4 hours ago

                  Playwright support will improve soon. We are rewriting node:http’s client implementation to pass node’s test suite. Expecting that to land next week.

                  • koakuma-chan 17 hours ago

                    You can use Bun as package manager only. You don't have to use Bun as runtime.

                    • iansinnott 3 hours ago

                      Indeed! Also as a test runner/lib, if you're not doing browser automation. Bun definitely has benefits even if not used as a runtime.

                      • winterrdog 13 hours ago

                        Sure?

                        Does it work if I have packages with Node.js C++ addons?

                        • abejfehr 10 hours ago

                          Why wouldn't it? The end result of an npm install or a bun install is that the node_modules folder is structured the way it needs to be, and I think it can run node-gyp for the packages that need it.

                      • Cthulhu_ 17 hours ago

                        I think this is the big one that slows adoption of "better" / "faster" tooling down, that is, backwards compatibility and drop-in-replacement-ability. Probably a lot of Hyrum's Law.

                        • epolanski 7 hours ago

                          Playwright was fixed about a year ago, I think.

                          • drewbitt 12 hours ago

                            Deno doesn't work with crawlee either unfortunately

                            • petralithic 12 hours ago

                              You should try Deno, they have good Node compatibility

                              • erikpukinskis 7 hours ago

                                Does it? Last I tried, several years ago, coverage of the Node APIs was not good. I wanted to send data over UDP and a lot of Node basics there were missing.

                                • spartanatreyu 5 hours ago

                                  Deno's node compat is way better now.

                                  They're still missing niche things and they tend to target the things that most devs (and their dependencies) are actually using.

                                  But I can see they have it in their compat stuff now and it looks like it's working in the repl locally: https://docs.deno.com/api/node/dgram/

                              • jherdman 15 hours ago

                                Storybook is another for me.

                              • simantel 13 hours ago

                                Node also has a built-in server and SQLite these days though? Or if you want a lot more functionality with just one dependency, Hono is great.

                                • blizdiddy 12 hours ago

                                  And how many dependencies does Hono have? Looks like about 26. And how many dependencies do those have?

                                    A single static Zig executable isn't the same as a pipeline of package management dependencies susceptible to supply chain attacks and the worst bitrot we've had since the DOS era.

                                  • bakkoting 9 hours ago

                                    > And how many dependencies does Hono have?

                                    Zero.

                                    I'm guessing you're looking at the `devDependencies` in its package.json, but those are only used by the people building the project, not by people merely consuming it.

                              • manuhabitela 14 hours ago

                                I'm impressed how pleasant and easy to read this pretty technical explanation was. Good job on the writing.

                                • winterrdog 13 hours ago

                                  Truth!

                                  Lydia is very good at presenting complex ideas simply and well. I've read and watched most of her work or videos. She really goes to great lengths in her work to make it come to life. Highly recommend her articles and YouTube videos.

                                  Though she's been writing less lately, I think due to her current job.

                                • thornewolf 16 hours ago

                                  I think they forgot to include the benchmark time for "npm (cached)" inside the Binary Manifest Caching section. We have bun, bun (cached), npm. I think the summary statistics are also incorrect.

                                  • Jarred 4 hours ago

                                    I work on Bun and also spent a lot of time optimizing bun install. Happy to answer any questions

                                    • nzoschke 4 hours ago

                                      I just want to say thanks to you and the team and community. Bun is a treat to use.

                                    • alberth 14 hours ago

                                      I really enjoyed the writing style of this post.

                                      A few things:

                                      - I feel like this post, repurposed, could be a great explanation of why io_uring is so important.

                                      - I wonder if Zig's recent I/O updates in v0.15 bring any perf improvement to Bun beyond its current fast perf.

                                      • aleyan 16 hours ago

                                        I have been excited about Bun for about a year, and I thought that 2025 was going to be its breakout year. It is really surprising to me that it is not more popular. I scanned the top 100k repos on GitHub, and for new repos in 2025, npm is 35 times more popular and pnpm is 11 times more popular than Bun [0][1]. The other up-and-coming JavaScript runtime, Deno, is not so popular either.

                                        I wonder why that is? Is it because it is a runtime, and getting compatibility there is harder than just for a straight package manager?

                                        Can someone who tried bun and didn't adopt it personally or at work chime in and say why?

                                        [0] https://aleyan.com/blog/2025-task-runners-census/#javascript...

                                        [1] https://news.ycombinator.com/item?id=44559375

                                        • dsissitka 15 hours ago

                                          I really want to like Bun and Deno. I've tried using both several times and so far I've never made it more than a few thousand lines of code before hitting a deal breaker.

                                          Last big issue I had with Bun was streams closing early:

                                          https://github.com/oven-sh/bun/issues/16037

                                          Last big issue I had with Deno was a memory leak:

                                          https://github.com/denoland/deno/issues/24674

                                          At this point I feel like the Node ecosystem will probably adopt the good parts of Bun/Deno before Bun/Deno really take off.

                                          • hoten 11 hours ago

                                            uh... looks like an AI user saw this comment and fixed your bun issue? Or maybe it just deleted code in a random manner idk.

                                            https://github.com/oven-sh/bun/commit/b474e3a1f63972979845a6...

                                            • drewbitt 9 hours ago

                                              The Bun team uses Discord to kick off the Claude bot, so someone probably saw the comment and told it to do it. That edit doesn't look particularly good though.

                                          • phpnode 16 hours ago

                                            It's a newer, VC-funded competitor to the open source, battle-tested dominant player. It has incentives to lock you in and ultimately is just not that different from Node. There's basically no strategic advantage to using Bun; it doesn't really enable anything you can't do with Node. I have not seen anyone serious choose it yet, but I've seen plenty of unserious people use it.

                                            • marcjschmidt 11 hours ago

                                              I think that summarizes it well. It's not 10x better, which is what would make the risky bet of vendor lock-in with a VC-backed company worth it. Same issue with Prisma and Next for me.

                                            • williamstein 16 hours ago

                                              I am also very curious what people think about this. To me, as a project, Node gives off a vibe of being mature, democratic and community driven, especially after successfully navigating the io.js fork drama a few years ago. Clearly neither Bun nor Deno are community-driven democratic projects, since they are both VC funded.

                                              • silverwind 16 hours ago

                                                Take a look at their issue tracker, it's full of crashes because apparently this Zig language is highly unsafe. I'm staying on Node.

                                                • audunw 12 hours ago

                                                  Zig isn't inherently highly unsafe. A bit less safe than Rust in some regards, but arguably safer in a few others.

                                                  But the language hasn't even reached 1.0 yet. A lot of the strategies for writing safe Zig aren't fully developed.

                                                  Yet, TigerBeetle is written in Zig and is an extremely robust piece of software.

                                                  I think the focus of Bun is probably more on feature parity in the short term.

                                                  • petralithic 12 hours ago

                                                    That's why, if I had to choose a Node competitor out of Bun and Deno, I'd choose Deno.

                                                    • mk12 15 hours ago

                                                      Good thing libuv is written in a "safe" language.

                                                      • otikik 14 hours ago

                                                        npm is a minefield that thousands of people traverse every day. So you are unlikely to hit a mine.

                                                        bun is a bumpy road that sees very low traffic. So you are likely to hit some bumps.

                                                      • keybored 9 hours ago

                                                        There’s a `crash` label. 758 open issues.

                                                        • actionfromafar 9 hours ago

                                                          Well, Node is C++, which isn't exactly safe either. But it's more tested.

                                                        • johnfn 16 hours ago

                                                          I am Bun's biggest fan. I use it in every project I can, and I write all my one-off scripts with Bun/TS. That being said, I've run into a handful of issues that make me a little anxious to introduce it into production environments. For instance, I had an issue a bit ago where something simple like an Express webserver inside Docker would just hang, but switching bun for node worked fine. A year ago I had another issue where a Bun + Prisma webserver would slowly leak memory until it crashed. (It's been a year, I'm sure they fixed that one).

                                                          I actually think Bun is so good that it will still net save you time, even with these annoyances. The headaches it resolves around transpilation, modules, workspaces etc, are just amazing. But I can understand why it hasn't gotten closer to npm yet.

                                                          • MrJohz 16 hours ago

                                                            I think part of the issue is that a lot of the changes have been fairly incremental, and therefore fairly easy to include back into NodeJS. Or they've been things that make getting started with Bun easier, but don't really add much long-term value. For example, someone else in the comments talked about the sqlite module and the http server, but now NodeJS also natively supports sqlite, and if I'm working in web dev and writing servers, I'd rather use an existing, battle-tested framework like Express or Fastify with a larger ecosystem.

                                                            It's a cool project, and I like that they're not using V8 and trying something different, but I think it's very difficult to sell a change on such incremental improvements.

                                                            • veber-alex 16 hours ago

                                                              Neither Bun nor Deno have any killer features.

                                                              Sure, they have some nice stuff that should also be added in Node, but nothing compelling enough to deal with ecosystem change and breakage.

                                                              • gkiely 11 hours ago

                                                                bun test is a killer feature

                                                              • fleebee 7 hours ago

                                                                There are some rough edges to Bun (see sibling comments), so there's an apparent cost to switching, namely wasted developer time dealing with Node incompatibility. Being able to install packages 7x faster doesn't matter much to me, so I don't see an upside to making the switch.

                                                                • davidkunz 16 hours ago

                                                                  I tried to run my project with bun - it didn't work so I gave up. Also, there needs to be a compelling reason to switch to a different ecosystem.

                                                                  • tracker1 14 hours ago

                                                                    There are still a few compatibility sticking points... I'm far more familiar with Deno and have been using it a lot the past few years; it's pretty much my default shell scripting tool now.

                                                                    That said, for many work projects, I need to access MS-SQL, which the way it does socket connections isn't supported by the Deno runtime, or some such. Which limits what I can do at work. I suspect there's a few similar sticking points with Bun for other modules/tools people use.

                                                                    It's also very hard to break away from entropy. Node+npm had over a decade and a lot of effort to build that ecosystem that people aren't willing to just abandon wholesale.

                                                                    I really like Deno for shell scripting because I can use a shebang, reference dependencies and the runtime just handles them. I don't have the "npm install" step I need to run separately, it doesn't pollute my ~/bin/ directory with a bunch of potentially conflicting node_modules/ either, they're used from a shared (configurable) location. I suspect bun works in a similar fashion.

                                                                    That said, with work I have systems I need to work with that are already in place or otherwise chosen for me. You can't always just replace technology on a whim.

                                                                    • oefrha 15 hours ago

                                                                      To beat an incumbent you need to be 2x better. Right now it seems to be a 1.1x better (for any reasonably sized projects) work in progress with kinks you’d expect from a work in progress and questionable ecosystem buy-in. That may be okay for hobby projects or tiny green field projects, but I’m absolutely not gonna risk serious company projects with it.

                                                                      • k__ 11 hours ago

                                                                        Seems awfully close to 2x, and that was last year.

                                                                        https://dev.to/hamzakhan/rust-vs-go-vs-bun-vs-nodejs-the-ult...

                                                                        • oefrha 8 hours ago

                                                                          > 1.1x better (for any reasonably sized projects)

                                                                          2x in specific microbenchmarks doesn’t translate to big savings in practice. We don’t serve a static string with an application server in prod.

                                                                      • turtlebits 14 hours ago

                                                                        Tried it last year - I spent a few hours fighting the built-in SQLite driver, found it buggy (silent errors), and the docs were very lacking.

                                                                        • fkyoureadthedoc 16 hours ago

                                                                          Bun is much newer than pnpm; looking at 1.0 releases, pnpm has about a six-year head start.

                                                                          I write a lot of one off scripts for stuff in node/ts and I tried to use Bun pretty early on when it was gaining some hype. There were too many incompatibilities with the ecosystem though, and I haven't tried since.

                                                                          • madeofpalk 16 hours ago

                                                                            Honestly, it doesn't really solve a big problem I have, and introduces all the problems of being "new" and less used.

                                                                            • koakuma-chan 16 hours ago

                                                                              > I wonder why that is?

                                                                              LLMs default to npm

                                                                              • fkyoureadthedoc 16 hours ago

                                                                                You sure it's not just because npm has been around for 15 years as the default package manager for node?

                                                                                • koakuma-chan 16 hours ago

                                                                                  Didn't prevent me from switching to Bun as the cost is 0.

                                                                            • RestartKernel 7 hours ago

                                                                              This is very nicely written, but I don't quite get how Linux's hardlinks are equivalent to MacOS's clonefile. If I understand correctly, wouldn't the former unexpectedly update files across all your projects if you modify just one "copy"?

                                                                              • valtism 2 hours ago

                                                                                I had no idea Lydia was working for Bun now. Her technical writing is absolutely top notch

                                                                                • atonse 9 hours ago

                                                                                  I absolutely loved reading this. It's such an excellent example of a situation where Computer Science principles are very important in day to day software development.

                                                                                  So many of these concepts (Big O, temporal and spatial locality, algorithmic complexity, lower level user space/kernel space concepts, filesystems, copy on write), are ALL the kinds of things you cover in a good CS program. And in this and similar lower level packages, you use all of them to great effect.

                                                                                  • epolanski 7 hours ago

                                                                                    This is about software engineering not computer science.

                                                                                    CS is the study of computations and their theory (programming languages, algorithms, cryptography, machine learning, etc).

                                                                                    SE is the application of engineering principles to building scalable and reliable software.

                                                                                  • wink 16 hours ago

                                                                                    > Node.js uses libuv, a C library that abstracts platform differences and manages async I/O through a thread pool.

                                                                                    > Bun does it differently. Bun is written in Zig, a programming language that compiles to native code with direct system call access:

                                                                                    Guess what, C/C++ also compiles to native code.

                                                                                    I mean, I get what they're saying and it's good, and nodejs could have probably done that as well, but didn't.

                                                                                    But don't phrase it like it's inherently not capable. No one forced npm to be using this abstraction, and npm probably should have been a nodejs addon in C/C++ in the first place.

                                                                                    (If anything of this sounds like a defense of npm or node, it is not.)

                                                                                    • k__ 16 hours ago

                                                                                      To me, the reasoning seems to be:

                                                                                      Npm, pnpm, and yarn are written in JS, so they have to use Node.js facilities, which are based on libuv, which isn't optimal in this case.

                                                                                      Bun is written in Zig, so it doesn't need libuv and can do its own thing.

                                                                                      Obviously, someone could write a Node.js package manager in C/C++ as a native module to do the same, but that's not what npm, pnpm, and yarn did.

                                                                                      • lkbm 15 hours ago

                                                                                        Isn't the issue not that libuv is C, but that the thing calling it (Node.js) is Javascript, so you have to switch modes each time you have libuv make a system call?

                                                                                      • markasoftware 9 hours ago

                                                                                        I'm pretty confused about why it's beneficial to wait to read the whole compressed file before decompressing. Surely the benefits of beginning decompression before the download is complete outweigh having to copy the memory around a few extra times as the vector is resized?

                                                                                        • Jarred 4 hours ago

                                                                                          Streaming prevents many optimizations because the code can’t assume it’s done when run once, so it has to suspend / resume, clone extra data for longer, and handle boundary cases more carefully.

                                                                                          It’s usually only worth it after ~tens of megabytes, but vast majority of npm packages are much smaller than that. So if you can skip it, it’s better.

                                                                                        • LeicaLatte 9 hours ago

                                                                                          Liking the framing of package management from first principles as a systems-level optimization problem rather than file scripting. It resembles a database engine - dependency-aware task scheduling, cache locality, syscall overhead - they are all there.

                                                                                          • tracker1 14 hours ago

                                                                                            I'm somewhat curious how Deno stands up with this... also, not sure what packages are being installed. I'd probably start a vite template project for react+ts+mui as a baseline, since that's a relatively typical application combo for tooling. Maybe hono+zod+openapi as well.

                                                                                            • tracker1 11 hours ago

                                                                                              For my own curiosity, on a React app on my work desktop:

                                                                                                  - Clean `bun install`, 48s - converted package-lock.json
                                                                                                  - With bun.lock, no node_modules, 19s
                                                                                                  - Clean with `deno install --allow-scripts`, 1m20s
                                                                                                  - with deno.lock, no node_modules, 20s
                                                                                                  - Clean `npm i`, 26s
                                                                                                  - `npm ci` (package-lock.json), no node_modules, 1m2s (wild)
                                                                                              
                                                                                              So it looks like if Deno added a package-lock.json conversion similar to Bun's, the installs would be very similar all around. I have no control over the security software used on this machine; it was just convenient since I was in front of it.

                                                                                              Hopefully someone can put eyes on this issue: https://github.com/denoland/deno/issues/25815

                                                                                              • steve_adams_86 12 hours ago

                                                                                                I think Deno isn't included in the benchmark because it's a harder comparison to make than it might seem.

                                                                                                Deno's dependency architecture isn't built around npm; that compatibility layer is a retrofit on top of the core (which is evident in the source code, if you ever want to see). Deno's core architecture around dependency management uses a different, URL-based paradigm. It's not as fast, but... It's different. It also allows for improved security and cool features like the ability to easily host your own secure registry. You don't have to use npm or jsr. It's very cool, but different from what is being benchmarked here.

                                                                                                • tracker1 11 hours ago

                                                                                                  All the same, you can run `deno install` in a directory with a package.json file and it will resolve and install to node_modules. The process is also written in compiled code, like Bun... so I was just curious.

                                                                                                  edit: replied to my own post... looks like `deno install --allow-scripts` is about 1s slower than bun once deno.lock exists.

                                                                                              • randomsofr 13 hours ago

                                                                                                Wow, crazy to see Yarn being so slow when it used to beat npm by a lot. At a company I was at, we went from npm, to yarn, to pnpm, and back to npm. Nowadays I try to use Bun as much as possible, but Vercel still doesn't use it natively for Next.

                                                                                                • chrisweekly 13 hours ago

                                                                                                  why leave pnpm?

                                                                                                • k__ 16 hours ago

                                                                                                  "... the last 4 bytes of the gzip format. These bytes are special since store the uncompressed size of the file!"

                                                                                                  What's the reason for this?

                                                                                                  I could imagine many tools profiting from knowing the decompressed file size in advance.

                                                                                                  • philipwhiuk 16 hours ago

                                                                                                    It's straight from the GZIP spec if you assume there's a single GZIP "member": https://www.ietf.org/rfc/rfc1952.txt

                                                                                                    > ISIZE (Input SIZE)

                                                                                                    > This contains the size of the original (uncompressed) input data modulo 2^32.

                                                                                                    So there are two big caveats:

                                                                                                    1. Your data is a single GZIP member (I guess this means everything in a folder)

                                                                                                    2. Your data is < 2^32 bytes.
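For illustration, a minimal sketch of reading ISIZE from the trailer (assuming a single-member gzip stream under 2^32 bytes, per the caveats above):

```python
import struct

def gzip_uncompressed_size(path):
    """Read ISIZE: the last 4 bytes of a gzip file, stored little-endian,
    holding the uncompressed size mod 2^32 (RFC 1952)."""
    with open(path, "rb") as f:
        f.seek(-4, 2)  # 2 = os.SEEK_END: position 4 bytes before end of file
        return struct.unpack("<I", f.read(4))[0]
```

Note this is just a sketch: it trusts the trailer without validating the CRC32 that precedes it, and it gives a wrong answer for multi-member or >4 GiB archives.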

                                                                                                    • k__ 16 hours ago

                                                                                                      Yeah, I understood that.

                                                                                                      I was just wondering why GZIP specified it that way.

                                                                                                      • ncruces 16 hours ago

                                                                                                        Because it allows streaming compression.

                                                                                                        • k__ 15 hours ago

                                                                                                          Ah, makes sense.

                                                                                                          Thanks!

                                                                                                    • lkbm 15 hours ago

                                                                                                      I believe it's because you get to stream-compress efficiently, at the cost of stream-decompress efficiency.

                                                                                                      • 8cvor6j844qw_d6 16 hours ago

                                                                                                        gzip.py [1]

                                                                                                            def _read_eof(self):
                                                                                                                # We've read to the end of the file, so we have to rewind in order
                                                                                                                # to reread the 8 bytes containing the CRC and the file size.
                                                                                                                # We check that the computed CRC and size of the
                                                                                                                # uncompressed data matches the stored values. Note that the size
                                                                                                                # stored is the true file size mod 2**32.

                                                                                                        [1]: https://stackoverflow.com/a/1704576

                                                                                                      • djfobbz 16 hours ago

                                                                                                        I really like Bun too, but I had a hard time getting it to play nicely with WSL1 on Windows 10 (which I prefer over WSL2). For example:

                                                                                                          ~/: bun install
                                                                                                          error: An unknown error occurred (Unexpected)
                                                                                                        • lfx 16 hours ago

                                                                                                          Why do you prefer WSL1 over WSL2?

                                                                                                          • tracker1 14 hours ago

                                                                                                            FS calls across the OS boundary are significantly faster in WSL1; that's the biggest example off the top of my head. I prefer WSL2 myself, but I avoid using /mnt/c/ paths as much as possible, and I never, ever run a database (like SQLite) across that boundary; you will regret it.

                                                                                                            • djfobbz 13 hours ago

                                                                                                              WSL1's just faster, no weird networking issues, and I can edit the Linux files from both Windows and Linux without headaches.

                                                                                                          • rs_rs_rs_rs_rs 17 hours ago

                                                                                                            Python has uv, JS has bun, what does Ruby or PHP have? Are the devs using those languages happy with how fast the current popular dependency managers are?

                                                                                                            • JamesSwift 17 hours ago

                                                                                                              You're looking at it wrong. Python has nix, JS has nix, Ruby and PHP have nix :D

                                                                                                              That's closer to how pnpm achieves its speedup, though. I know there's 'rv' recently, but I haven't tried it.

                                                                                                              • koakuma-chan 17 hours ago

                                                                                                                You mean nix the package manager? I used to use NixOS and I had to switch off because of endless mess with environment variables.

                                                                                                                • JamesSwift 4 hours ago

                                                                                                                  Yes, nix package manager. Or devenv for a more streamlined version of what I'm describing, similar to mise but powered by nix.

                                                                                                              • hu3 17 hours ago

                                                                                                                PHP is getting Mago (written in Rust).

                                                                                                                Repo: https://github.com/carthage-software/mago

                                                                                                                Announcement 9 months ago:

                                                                                                                https://www.reddit.com/r/PHP/comments/1h9zh83/announcing_mag...

                                                                                                                For now it has three main features: formatting, linting, and fixing lint issues.

                                                                                                                I hope they add package management to do what composer does.

                                                                                                                • tommasoamici 17 hours ago

                                                                                                                  It's pretty new, but in Ruby there's `rv`, which is clearly inspired by `uv`: https://github.com/spinel-coop/rv.

                                                                                                                  >Brought to you by Spinel

                                                                                                                  >Spinel.coop is a collective of Ruby open source maintainers building next-generation developer tooling, like rv, and offering flat-rate, unlimited access to maintainers who come from the core teams of Rails, Hotwire, Bundler, RubyGems, rbenv, and more.

                                                                                                                  • weaksauce 15 hours ago

                                                                                                                    Bundler is generally pretty fast on the Ruby side. It also reuses dependencies for a given Ruby version, so you don't have the node_modules folder in every project with every dependency re-downloaded and stored. If you have 90% of the dependencies for a project, you only have to download and install/compile the other 10%. Night and day difference.

                                                                                                                    • aarondf 17 hours ago

                                                                                                                      PHP has Composer, and it's extremely good!

                                                                                                                      • kijin 16 hours ago

                                                                                                                        PHP is much closer to raw C and doesn't do any threading by default, so I suppose composer doesn't suffer from the thread synchronization and event loop related issues that differentiate bun from npm.

                                                                                                                        • gertop 7 hours ago

                                                                                                                          But node doesn't do threading by default either? Are you saying that npm is somehow multithreaded?

                                                                                                                    • phildougherty 14 hours ago

                                                                                                                      Bun is FUN to say.

                                                                                                                      • wojtek1942 8 hours ago

                                                                                                                        > However, this mode switching is expensive! Just this switch alone costs 1000-1500 CPU cycles in pure overhead, before any actual work happens.

                                                                                                                        ...

                                                                                                                        > On a 3GHz processor, 1000-1500 cycles is about 500 nanoseconds. This might sound negligibly fast, but modern SSDs can handle over 1 million operations per second. If each operation requires a system call, you're burning 1.5 billion cycles per second just on mode switching.

                                                                                                                        > Package installation makes thousands of these system calls. Installing React and its dependencies might trigger 50,000+ system calls: that's seconds of CPU time lost to mode switching alone! Not even reading files or installing packages, just switching between user and kernel mode.

                                                                                                                        Am I missing something, or is this incorrect? They claim 500ns per syscall and 50k syscalls. 500ns * 50,000 = 25 milliseconds, which is very far from "seconds of CPU time lost to mode switching alone", right?
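Spelling out the arithmetic (taking the article's figures of 1,500 cycles per switch at 3 GHz as given):

```python
# Assumed figures from the article: 1,500 cycles per user/kernel mode
# switch, on a 3 GHz clock.
cycles_per_switch = 1500
clock_hz = 3_000_000_000

seconds_per_switch = cycles_per_switch / clock_hz  # 5e-7 s, i.e. 500 ns

# The article's claimed syscall count for installing React and its deps.
syscalls = 50_000
total_seconds = syscalls * seconds_per_switch

print(total_seconds)  # ~0.025 s, i.e. 25 ms, not "seconds of CPU time"
```

Even at the article's later figure of 4 million syscalls, that's 4e6 * 5e-7 = 2 seconds, so only the most syscall-heavy tool gets into "seconds" territory.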

                                                                                                                        • Bolwin 8 hours ago

                                                                                                                          Read further. In one of the later benchmarks, yarn makes 4 million syscalls.

                                                                                                                          Still only about 2 secs, but still.

                                                                                                                        • yahoozoo 5 hours ago

                                                                                                                          Good article but it sounds a lot like AI wrote it.

                                                                                                                          • swyx 14 hours ago

                                                                                                                            I'm curious why Yarn is that much slower than npm? What's the opposite of this article?

                                                                                                                            • WesolyKubeczek 8 hours ago

                                                                                                                              macOS has hardlinks too. Why not use them?

                                                                                                                              • moffkalast 14 hours ago

                                                                                                                                Anyone else also having a first association to https://xkcd.com/1682 instead of, you know, bread?

                                                                                                                                • paularmstrong 14 hours ago

                                                                                                                                  This is all well and good, but the time it takes to install node modules is not a critical blocker for any project that I've ever been a part of. It's a drop in the bucket compared to human (ability and time to complete changes) and infrastructure (CI/deploy/costs). Cutting 20 seconds off the dependency install time is just not a make or break issue.

                                                                                                                                  • tracker1 14 hours ago

                                                                                                                                    It's more than enough to lose your focus. If you can make a process take a couple seconds or less vs over 15, you should do that.

                                                                                                                                    • paularmstrong 13 hours ago

                                                                                                                                      How often are you doing a full install of dependencies? Re-runs for me using npm/pnpm/yarn are 1-2 seconds at worst in very large monorepos. I can't imagine needing to do full installs with any sort of frequency.

                                                                                                                                      • tracker1 12 hours ago

                                                                                                                                        I find it's heavily dependent on drive speed, so I've leaned into getting current-generation, very fast drives as much as possible when I put together new computers, and sometimes a mid-generation upgrade. Since I often do consulting work across random projects, I'm pretty often having to figure out and install things in one monorepo managed with pnpm, another with yarn, etc., so the pain is relatively real. That said, the fastest drive matters as much or more, especially with build steps.

                                                                                                                                        When handling merge/pull requests, I'll often do a clean step (removing node_modules and temp files) before a full install and build to test that everything works. I know not everyone else is this diligent, but this can happen several times a day. Automation (usually via Docker) can help a lot, with many things tested through a CI/CD environment; that said, I'm also not a fan of waiting too long for that process. It's too easy to get side-tracked and off-task. I tend to set alarms/timers throughout the day just so I don't miss meetings. I take a moment to look at HN, and next thing I know it's a few hours later. Yeah, that's my problem... but others share it.

                                                                                                                                        So, again, if you can make something take less than 15s that typically takes much more, I'm in favor... I went from eslint to Rome/Biome for similar reasons... I will switch to faster tooling to reduce the risk of going off-task and not getting back.

                                                                                                                                        • paularmstrong 13 hours ago

                                                                                                                                          I also just tried bun install in my main work monorepo vs yarn. bun took 12s and yarn took 15s. This is hardly a difference worth noting.

                                                                                                                                          • tracker1 12 hours ago

                                                                                                                                            Yeah, I find drive speed can make more of a difference too. Gen 5 PCIe is amazing; the difference in Rust builds was pretty phenomenal over even a good Gen 4 drive.

                                                                                                                                      • sgarland 5 hours ago

                                                                                                                                        I am thrilled that anyone in the web dev community is giving a shit about performance, and clearly knows something about how a computer actually works.

                                                                                                                                        I am so, so tired of the “who cares if this library eats 100 MiB of RAM; it’s easier” attitude.