I am more shocked by the origin story than by the acquisition.
> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.
Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.
A lot of people seem confused about this acquisition because they think of Bun as a Node.js-compatible bundler/runtime and just compare it to Deno or npm. But I think it's a really smart move if you consider where Bun has been pushing lately, which is a kind of cloud-native, self-contained runtime (S3 API, SQL, streaming, etc.). For an agent like Claude Code this trajectory is really interesting, because you are creating a runtime where your agent can work inside cloud services as fluently as it currently does with a local filesystem. Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases.
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
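A minimal sketch of that pattern, using Bun's file API; the path and the filter here are invented for illustration:

```ts
// Sketch: instead of pasting a huge log into the model's context, the agent
// emits and runs a script like this, and only the tiny result re-enters the
// context. Path and filter are made up.
const text = await Bun.file("/var/log/app.log").text();

const recentErrors = text
  .split("\n")
  .filter((line) => line.includes("ERROR"))
  .slice(-20); // only the last 20 error lines go back to the model

console.log(recentErrors.join("\n"));
```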
I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.
AI tools value simplicity and fast bootstrapping and iteration; this rules out the JVM, which has the worst build system and package repositories I've ever had the displeasure of needing to use. Checking in Gradle binaries in 2025? Waiting days for packages to sync? Wrappers on every project for Windows/Linux? It's broken beyond repair.
By contrast `bun install` is about as good as it gets.
It’s relevant enough that I feel I can roll out this bash.org classic…
<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders
EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!
Not discovered from scratch, but I was a big fan when it was alive and kicking. Went there from time to time to get some mood boosters, so I was very sad when I found that the original was gone. Thanks a lot for sharing that bash-org-archive.com exists; what great fun going down this memory lane.
I’ve been browsing the archive since I left that comment, they really were the good old days weren’t they. IRC was my introduction to geekdom, and I don’t think it would be unreasonable to say it shaped my life. Here I am 30-ish years later, an old man yelling at clouds — and I wouldn’t change much!
If anyone ever requested/used an eggdrop(?) bot from #farmbots or #wildbots on quakenet then thanks to you too; that was certainly one of the next steps down the path I took. A (probably very injectable) PHP blog and a bunch of TCL scripts powering bots, man I wish I could review that code now.
I don’t think there is a search function, I got the exact wording from a web search (I think “bash Java anal”, arguably a dangerous search!) and then after submitting I wondered if there is an archive of the quotes.
May I ask, what is this obsession with targeting the browser? I've also noticed a hatred of k8s here, and while I truly understand it, I'd take the complication of managing infrastructure over frontend fads any day.
This is a site for startups. They have no business running k8s, in fact, many of the lessons learned get passed on from graybeards to the younger generation along those lines. Perhaps I'm wrong! I'd love to talk shop somewhere.
This is absolutely untrue. Code from JDK 8 runs fine on JDK 25 (just released LTS). It is true that if you did something silly that locks you into certain dependency versions, you may be stuck, but this is not the majority of applications.
Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
JS has the fastest, most robust and widely deployed sandboxing engines (V8, followed closely by JavaScriptCore which is what Bun uses). It also has TypeScript which pairs well with agentic coding loops, and compiles to the aforementioned JavaScript which can run pretty much anywhere.
Note that "sandboxing" in this case is strictly runtime sandboxing - it's basically like having a separate process per event loop (as if you ran separate Node processes). It does not sandbox the machine context in which it runs (i.e. it's not VM-level containment).
When you say runtime sandboxing, are you referring to JavaScript agents? I haven't worked all that much with JavaScript execution environments outside of the browser so I'm not sure about what sandboxing mechanics are available.
Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.
It's interesting to see the difference in how both treat the module. It feels similar to a realm, which makes me lean by default toward not trusting it for untrusted code execution.
It looks like Bun also supports Shadow Realms which from my understanding was more intended for sandboxing (although I have no idea how resources are shared between a host environment and Shadow Realms, and how that might potentially differ from the node VM module).
The reference docs are auto generated from node's TypeScript types. node:vm is better than using the same global object to run untrusted code, but it's not really a sandbox.
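For illustration, here is the classic, well-documented escape that makes node:vm unsuitable as a security boundary:

```ts
// The sandbox object comes from the host realm, so its prototype chain
// reaches the host's Function constructor, which evaluates code with
// access to the real `process`.
import vm from "node:vm";

const leaked = vm.runInNewContext(
  "this.constructor.constructor('return process')().version",
  {} // the would-be "sandbox"
);
console.log(leaked); // prints the host runtime's version: containment failed
```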
> It also has TypeScript which pairs well with agentic coding loops, (...)
I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?
> It also has TypeScript which pairs well with agentic coding loops
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how close TypeScript and C# are syntactically, C#'s speed advantage over JS, among many other things, would have made C# the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
Typescript is probably generally a good LLM language because
- static types
- tons and tons of training data
Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
> C#'s speed advantage over JS among many other things would make C# the main language
Nobody cares about this; JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go, because of its fast compiler and better performance.
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
This is one of those, "in theory, there's no difference between theory and practice. In practice, there is" issues.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in production-grade developer experience and the quality of packages on offer on PyPI and npm versus Maven and NuGet.
As a developer who switches between Java, Python, and TypeScript every day, I think this is a fairly myopic opinion. Being siloed in one language for long enough tends to bring out our tribalistic tendencies; tread carefully.
I've seen codebases of varying quality in nearly every language, "enterprise" and otherwise. I've worked at a C# shop and it was no better or worse than the java/kotlin/typescript ones I've worked at.
You can blame the "average" developer in a language for "not caring", but more likely than not you're just observing the friction imposed by older packaging systems. Modern languages usually come with package managers that make it trivial to publish language artifacts to package hubs, whereas Gradle, for example, is its own brand of hell just to get your code to build.
That's not a fair comparison. In your example, you're talking about the average of developers in a language. In this situation, it's specific developers choosing between languages. Having the developers you already have choose language A or B makes no difference to their code quality (assuming they're proficient with both).
These are statements these developers will make themselves. They will say they don't like more strictly typed languages because they feel constrained and slowed down in development. They will argue that the performance hit is worth the trade offs.
A typical backend developer using C#/Java is likely solving more complicated problems and having all the concerns of an enterprise system to worry about and maintain.
Dismissing a dev or a system because it is enterprisey is a weak argument to make against a language. A language being used heavily in enterprises to carry the weight of the business is a sign that the language is actually great and reliable enough.
> Nonsense. Average Java/C# is an enterprise monkey who barely knows outside of their grotesque codebase.
Netflix is Java. Amazon is mostly Java. Some of the biggest open source projects in the world are Java. Unity and Godot both use C# for scripting.
I don't know where you're getting the impression that Java and C# are somehow only for "enterprise monkey who barely knows outside of their grotesque codebase"
Could also be a way to expand the customer base for Claude Code from coding assistant to vibe coding, a la Replit creating a hosted app. CC working more closely with Bun could make all that happen much faster:
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
>Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases
100%. Even more robust if paired with an overlay network that provides identity-based S3 access (rather than IP-address/network-based). Otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind a VPN or Direct Connect.
Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the firewall ACL, or be on the VPN/WAN.
That's a really cool use case and seems super helpful. Working cloud-native is a chore sometimes, with all the fiddling with internal APIs and ACL/permissions issues.
The writeup makes it sound like an acquihire, especially the "what changes" part.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
This matches some previous comments around LLMs driving adoption of programming languages or frameworks. If you ask Claude to write a web app, why not have it use your own framework, that it was trained on, by default?
Currently Claude etc. can interact with services (including AWS) via MCPs.
What the user you're replying to is saying is that the Bun acquisition looks silly if you see Bun as a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
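For a flavor of those bindings, a sketch based on Bun's built-in S3 client; the bucket and keys are made up, and credentials fall back to the usual environment variables:

```ts
import { S3Client } from "bun";

const bucket = new S3Client({ bucket: "acme-data" });

// Read straight from S3 as if it were a local file, no SDK to install.
const report: any = await bucket.file("reports/2025-q3.json").json();

// ...process, then write the result back.
await bucket.file("reports/2025-q3-note.txt").write(`rows: ${report.length}`);
```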
Claude Code would be a much worse product if it didn't ship with web search and filesystem operations but instead required you to configure an MCP kit.
As a command-line end user who prefers to retrieve data from the www as text-only, I see Deno and Bun as potential replacements (for me, not necessarily for anyone else) for the so-called "modern" browser in those rare cases where I need to interpret Javascript^1
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run Javascript from around the web. At least with a command-line web-capable JS runtime monstrosity the user could in theory exercise more control over which scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
1. One can already see an approach something like this being used in the case of
> At the time of writing, Bun's monthly downloads grew 25% last month (October, 2025), passing 7.2 million monthly downloads. We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
> They didn't have to join, which means they got a solid valuation.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
I don't like all of the decisions they made for the runtime, or some of the ways they communicate over social media and their company culture, but I do admire how well-run the operation seems to have been from the outside. They've done a lot with (relatively) little, which is refreshing in our industry. I don't doubt they had a long runway either.
Thanks I scrolled past that in the announcement page.
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
> They didn't have to join, which means they got a solid valuation.
This isn't really true. It's more about who wanted them to join. Maybe it was Anthropic who really wanted to take over Bun/hire Jarred, or it was Jarred who got sick of Bun and wanted to work on AI.
I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Yeah, now they are part of Anthropic, who haven't figured out monetization themselves. Yikes!
I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that, Anthropic sucks: their apps and web UI are complete crap, borderline unusable, and the models are just meh. I get it: CC's head probably pulled off a power play here, given that his department is carrying the company, and his secret sauce, according to marketing from Oven, was Bun. In fact, VS Code's Claude backend is distributed as a Bun-compiled binary, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.
Anthropic needs urgently, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.
> Yeah, now they are part of Anthropic, who haven't figured out monetization themselves.
Anthropic are on track to reach $9BN in annualised revenue by the end of the year, and the six-month-old Claude Code already accounts for $1BN of that.
Given the worries about LLM focused companies reaching profitability I have concerns that Bun's runway will be hijacked... I'd hate for them to go down with the ship when the bubble pops.
I'm sort of surprised to see that you used Claude Code so much. I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc. And I know Bun started with an extreme attention to detail around performance.
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
I am not your target with this question (I don't write Zig) but there is a spectrum of LLM usage for coding. It is possible to use LLMs extensively but almost never ship LLM generated code, except for tiny trivial functions. One can use them for ideation, quick research, or prototypes/starting places, and then build on that. That is how I use them, anyway
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
> Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious, in a way, to see the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience, you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
I'll give you a basic example where it saved me a ton of time to vibe code instead of doing it myself, and I believe it would hold true for anyone.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonably good at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for the things it's particularly well suited to.
Here's another one. An SM-24 Geophone + Raspberry PI 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using flask I need a web viewer + conversion on the geophone velocity figures for displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
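The conversion step being described is just numerical calculus on the sampled velocity signal; a sketch for illustration, assuming a uniform sample spacing dt:

```ts
// Geophones output velocity, so displacement is a cumulative integral of the
// samples and acceleration a derivative. dt is the sample spacing in seconds.
function toDisplacement(v: number[], dt: number): number[] {
  const d = [0];
  // cumulative trapezoidal integration
  for (let i = 1; i < v.length; i++) {
    d.push(d[i - 1] + ((v[i - 1] + v[i]) / 2) * dt);
  }
  return d;
}

function toAcceleration(v: number[], dt: number): number[] {
  const a = [0];
  // first-difference derivative
  for (let i = 1; i < v.length; i++) {
    a.push((v[i] - v[i - 1]) / dt);
  }
  return a;
}
```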
Yeah, I had OpenAI crank out 100 different fizzbuzz implementations in a dozen seconds, and many of them worked! No chance a developer would have done it that fast, and for anyone who needs to crank out fizzbuzz implementations at scale this is the tool to beat. The haters don't know what they're talking about.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful.
> However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
I find this comment interesting: the parent comment didn't suggest any past association, yet it seemingly uses the project reference as a pivot point for various out-group counter-signaling / negging of Bun?
I understand the concern, but really? I found this quote enough to offer proper comments:
> had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types
Folks at Bun are "Zig people" for obvious reasons, and a link was made with Handmade software. This has happened multiple times before with Bun specifically, so my response is not a "pivot" of any kind. I've highlighted and contrasted our differences to prevent further associations inside a viral HN thread. That's not unreasonable.
I also explicitly congratulated them for the acquisition.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: they don't go around yelling about their LLM use (or anti-use) because they're not interested in the online LLM wars. They just want to build things with the tools available.
More people should have such a healthy approach, not only to LLMs but to life in general. It's the same reason I partake less and less in online discourse: it's so tribal and filled with anger that it's just not worth it to contribute anymore. Learning how to be in the middle did wonders for me as a programmer, and I think as a person as well.
It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.
And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.
I can also see why it's a very good fit for LLM-heavy workflows.
I can't speak as much about the last two examples, but writing a giant parser file is pretty common in Zig from what I've seen. Here's Zig's own parser, for example[1]. I'm also not sure what you mean by memory unsafe, since all slices have bounds checks. It also looks like this uses an arena allocator, so lifetime tracking is pretty simple (dump everything onto the allocator, and copy over the result at the end). Granted, I could be misunderstanding the code, but that's the read I get of it.
Amazing news, congrats! Been using Bun for a long while now and I love it.
Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:
Thanks, Jarred. Seeing what you built with Bun has been a real inspiration, the way one focused engineer can shift an entire ecosystem. It pushed me back into caring about the lower-level side of things again, and I’m grateful for that spark. Congrats on the acquisition, and excited to see what’s next
Isn't that still "acqui-hiring" according to common usage of the term?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company are the main reason to the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
Acquihiring usually means that the product the team are working on will be ended and the team members will be set to work on other aspects of the existing company.
That is part of the definition given in the first paragraph of the Wikipedia article, but I think it’s a blurry line when the acquired company is essentially synonymous with a single open source project and the buyer wants the team of experts to continue developing that open source project.
I've never personally used Bun. I use node.js I guess. What makes Bun fundamentally better at AI than, say, bundling a node.js app that can run anywhere?
If the answer is performance, how does Bun achieve things quicker than Node?
On Bun's website, the runtime section features HTTP, networking, storage: all very web-focused. Any plans to start expanding into native ML support (e.g. GPUs, RDMA-type networking, cluster management, NFS)?
Probably not. When we add new APIs in Bun, we generally base the interface off of popular existing packages. The bar is very high for a runtime to include libraries because the expectation is to support those APIs ~forever. And I can’t think of popular existing JS libraries for these things.
You said elsewhere that there were many suitors. What is the single most important thing about Anthropic that leads you to believe they will be dominant in the coming years?
No idea about his feelings but believing that they will be dominant wouldn't have to be the reason he chose them. I could easily imagine that someone would decide based on (1) they offered enough money and (2) values alignment.
How much of your day-to-day is spent contributing code to the Bun codebase and do you expect it to decrease as Anthropic assigns more people to work on Bun?
I contributed to Bun one time for SQLite. I've a question about the licensing.
Will each contributor continue to retain their copyright, or will a CLA be introduced?
With Bun's existing OSS license and contribution model, all contributors retain their copyright and Bun retains the license to use those contributions. An acquisition of this kind cannot change the terms under which prior contributions were made without explicit agreement from all contributors. If Bun did switch to a CLA in the future, just like with any OSS project, that would only impact future contributions made after that CLA went into effect and it depends entirely on the terms established in that hypothetical CLA.
I have a PR that’s been sitting for a while that exposes the extra options from the renameat2 and renameatx_np syscalls, which are a good way to implement self-updaters that keep working even when multiple processes are updating the same path on disk at the same time. These syscalls are supported on Linux & macOS, but I don’t think there’s an equivalent on Windows. We use these syscalls internally for `bun install` to make adding packages to the global install cache work when multiple `bun install` processes are running simultaneously.
No high-level self updater api is planned right now, but yes for at least the low level parts needed to make a good one
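To make the motivation concrete, here is a rough sketch of the self-updater pattern those syscalls enable. The `exchange` option in the comment is hypothetical, standing in for renameat2's RENAME_EXCHANGE (Linux) / renameatx_np's RENAME_SWAP (macOS); this is not Bun's actual API:

```ts
import { writeFileSync, chmodSync, renameSync } from "node:fs";

function selfUpdate(binPath: string, newBinary: Uint8Array) {
  // Stage next to the destination so the rename stays on one filesystem.
  const staging = `${binPath}.new.${process.pid}`;
  writeFileSync(staging, newBinary);
  chmodSync(staging, 0o755);

  // Plain rename(2) clobbers whatever is at binPath, so two concurrent
  // updaters can overwrite each other's freshly installed binary. An
  // atomic exchange swaps the two paths instead, making the race harmless.
  renameSync(staging, binPath); // hypothetically: { exchange: true }
}
```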
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
I wonder if this is a sign of AI companies trying to pivot?
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owners to say: the owners have no power to enforce it, and the acquirer didn't actually say it, so nobody is on the hook. It's practically a cheesy pickup line, and given the context, it kind of is one.
I’ll be honest: while I have my doubts about the match of interests and cohesion between an AI company and a JS runtime company, I have to say this is the single best acquisition announcement blog post I’ve seen in 20 years or so.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
But how is another company that is also VC backed and losing money providing stability for Bun?
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders started with a passion project, got funding, built out something they were excited about, and then exited, than yet another AI company from non-technical founders, built with the sole purpose of getting funding and then exiting.
If that was genuinely happening here - Anthropic were selling inference for less than the power and data center costs needed to serve those tokens - it would indeed be a very bad sign for their health.
Those are estimates. Notice they didn’t assume 0% or a million %. They chose numbers that are a plausible approximation of the true unknown values, also known as an estimate.
This is a pretty silly thing to say. Investment banks suffer zero reputational damage when their analysts get this sort of thing wrong. They don't even have to care about accuracy, because there will never be a way to check this number, even if anyone wanted to go back and rate their assumptions, which also never happens.
I've seen a bunch of other estimates / claims of a 50-60% margin for Anthropic on serving. This was just the first one where I found a credible-looking link I could drop into this discussion.
They had pretty drastic price cuts on Opus 4.5. It's possible they're now selling inference at a loss to gain market share, or at least that their margins are much lower. Dario claims that all their previous models were profitable (even after accounting for research costs), but it's unclear that there's a path to keeping their previous margins and expanding revenue as fast or faster than their costs (each model has been substantially more expensive than the previous model).
It wouldn't surprise me if they found ways to reduce the cost of serving Opus 4.5. All of the model vendors have been consistently finding new optimizations over the last few years.
I've been wondering about this generally: are the per-request API prices I'm paying profitable for them, or sold at a loss? My billing would suggest they are not making a profit on the monthly fees (unless there are a bunch of enterprise accounts in group deals going unused; I think I am one of those).
But those AI/ML researchers, a.k.a. LLM optimization staff, are not cheap. Their salaries have skyrocketed, and some are being fought over like top-tier soccer stars and actors.
The leaders of Anthropic, OpenAI and DeepMind all hope to create models that are much more powerful than the ones they have now.
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
> their investors might still take a bath if the very-ambitious aspect of their operations do not bear fruit
Not really. If the technology stalls where it is, AI still captures a sizable chunk of the dollars previously paid to coders, transcribers, translators, and the like.
The bet, (I would have thought) obviously, is that AI will be a huge part of humanity’s future, and that Anthropic will be able to get a big piece of that pie.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
You are saying that you can raise $7B of debt at a double-digit interest rate. I am doubtful. While $7B is not a big number, the Madoff scam was only ~$70B in total over many years.
I am fairly skeptical about many AI companies, but as someone else pointed out, Anthropic has 10x'ed their revenue each of the past 3 years: 100m -> 1b -> 10b. While past performance is no predictor of future results, their product is solid and to me it looks like they have found PMF.
It often happens that VCs buy out companies from a friend's fund because the selling fund wants to show performance to their investors until "the big one", or to move cash from one wealthy pocket to another.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is almost public domain (MIT) and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
It's an acquihire. If Anthropic is already spending significant resources to improve Bun internally, or sees that it will have to, it makes a lot of sense. No nefarious undertones required.
An analogous example off the top of my head is Shopify hired Rafael Franca to work on Rails full-time.
If it was an acquihire, still a lot less slimy than just offering the employees they care about a large compensation package and leaving the company behind as a husk like Amazon, Google and Microsoft have done recently.
From the acquirer’s perspective, you’re right. (Bonus: it diminishes your own employees’ ability to leave and fundraise to compete with you.)
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
> And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
> Every employee is a flight risk if you don't pay them a competitive salary
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
> aren’t hired away with promises of a salary but instead large signing bonuses
Now you're being nitpicky.
Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.
> aren’t typically hired individually but as teams.
So?
VC bros seem to forget the labor market is also a free market as soon as it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt.
Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
> Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary
These aren't the same things, and nobody negotiating an acquisition or acquihire converts in this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
As a broader comment on online discourse: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
And the secretary, sales, project managers, etc. who get left behind because the founders and key people were taken away? In an acquisition, they may still be let go, but they would also make money from their equity.
You want those people specifically. To get them, you need to hire them for a lot more money than you pay your current folks. That causes a lot of resentment with folks and messes up things like salary bands, etc.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
I left out the part that the motivation for the acquirers was not to save money or to be slimy. It was the only way to get around overzealous government regulators making it harder to acquire companies.
The real risk is not that Anthropic will run out of money, but that they will change their strategy to something that isn't Bun-based, and supporting Bun won't make sense for them any more.
I admit, it is a good acquisition announcement. I can’t remember the last acquisition announcement whose promises were kept for more than 1-2 years. Leadership changes, priorities shift…
One thing I like about this, beyond it meaning Bun will be funded, is that Anthropic is a registered public benefit corporation. While this doesn't mean Anthropic can't fuck over the users of Bun, it at least puts in some roadblocks. The path of least resistance here should be to improve Bun for users, not to monetize it to the point where it's no longer valuable.
Bun and Deno's goals seem quite different, I don't expect that to change. Bun is a one stop shop with an ever increasing number of built-in high-level APIs. Deno is focused on low level APIs, security, and building out a standard lib/ecosystem that (mostly) supports all JS environments.
People who like Bun for what it is are probably still going to, and same goes for Deno.
That being said I don't see how Anthropic is really adding long term stability to Bun.
I think Deno's management have been somewhat distracted by their ongoing lawsuits with Oracle over the release of the Javascript trademark.
I started out with Deno and when I discovered Bun, I pivoted. Personally I don't need the Node.js/npm compatibility. Wish there were a Bun-lite freed of the backward compatibility.
> Will this make it more or less likely for people to use Bun vs Deno?
I'm not sure it will make much of a difference in the short term.
Those who were drawn to Bun by hype and/or some concerns around speed will continue to use Bun.
For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.
I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor, since most JS speed concerns come from downloading dependencies (a one-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).
----------------------------
The two largest problems I see in JS are:
1. Terrible security practices
2. A lack of a standard library which pushes people into dependency hell
Deno fixes both of those problems with a proper permission model and a standard library.
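For instance, a minimal illustration of the permission model; the script name and host are invented:

```ts
// Run with:
//
//   deno run --allow-net=api.example.com fetch_data.ts
//
// Network access is an explicit, per-host grant rather than a default.
const res = await fetch("https://api.example.com/data");
console.log(res.status);

// fetch("https://attacker.example") would fail with a permission error
// here instead of silently phoning home -- the supply-chain upside.
```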
----------------------------
> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I think any predictions between 1-10 years are going to be a little too chaotic. It all depends on how the AI bubble goes away.
But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.
Anthropic has been trying to win developer market share, and has been quite successful with Claude Code. While I understand the argument that this acquisition is to protect their usage in CC, or even just to acquire the team, I do hope that part of their goal is to use this to strengthen their brand. Being good stewards of open source projects is a huge part of how positively I view a company.
As someone who has been using Deno for the last few years, is there anything that Bun does better? Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?). The last time I checked Bun's source code, it was... quite messy and spaghetti-like, plus Zig doesn't really offer many safety features, so it's not that hard to write incorrect code. Zig does force some safety with ReleaseSafe IIRC, but it's still not the same as even modern C++, let alone Rust.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.
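For the curious, a toy version of that hard-linking trick, just to show the mechanism (paths are made up):

```ts
// "Copy" a package out of a global cache by linking each file, so installs
// are near-instant and the bytes exist on disk only once.
import { linkSync, mkdirSync, readdirSync } from "node:fs";
import { join } from "node:path";

function linkTree(cacheDir: string, target: string) {
  mkdirSync(target, { recursive: true });
  for (const entry of readdirSync(cacheDir, { withFileTypes: true })) {
    const src = join(cacheDir, entry.name);
    const dst = join(target, entry.name);
    if (entry.isDirectory()) linkTree(src, dst);
    else linkSync(src, dst); // hard link: same inode, no data copied
  }
}

linkTree("/home/me/.cache/pm/lodash@4.17.21", "node_modules/lodash");
```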
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask it to for whatever compatibility reason. There are plugins to things like esbuild to use the Deno resolver and not need a node_modules at all (if you aren't also using the Deno-provided bundler for whatever reason such as it disappeared for a couple versions and is still marked "experimental").
As the victim of the larger pre-Shai-Hulud attack: unfortunately, the install script validation wouldn't have protected you. Also, if you already have an infected package on the whitelist, a new infection in its install script will still affect you.
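(For context: the whitelist here is presumably Bun's `trustedDependencies` list in package.json, which gates lifecycle scripts per package, not per version; hence the failure mode described.)

```jsonc
// package.json -- Bun only runs lifecycle/install scripts for packages named
// here. If an already-trusted package later ships a malicious install
// script, it still runs, because the trust is per-package.
{
  "name": "my-app",
  "trustedDependencies": ["esbuild", "sharp"]
}
```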
I decided to stick with Node in general. I don't see any compelling reason to change it.
Faster install and less disk space due to hardlink? Not really all that important to me. Npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
Search for pointer exceptions or core dumps on Bun's GitHub issues and you'll see why people (should) use Deno over Bun, if only because Rust is a way more safe language than Zig.
This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state. Whether it be kernel exception, pointer exception, or Rust's panic! - these things exist.
The reason why you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case. Bun has its. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs. The World is getting old. Rust isn't any "safer" when Deno can panic too.
Don't make a false equivalence, how many times does one get a panic from Deno versus a segmentation fault in Bun? It's not a similar number, and it's simply wrong to say that both are just as unsafe when that's plainly untrue.
The only time I got a segfault in Bun was when I used bun:ffi to wrap GLFW and wgpu-native so I could run three.js on the desktop. Ironically, the segfault was in wgpu, which is Rust. But to be fair, it was because the GLFW surface had dirty flags for OpenGL and didn’t have the Vulkan extensions. So anyone would have faulted.
> This is a non sequitur. Both Rust and Zig and any other language has the ability to end in an exception state.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC languages there's degrees to memory safety.
I agree. Pointing at Github issues is a strange metric to me. If we want to use that as a canary then you shouldn't use Deno (2.4k open issues) or Bun (4.5k open issues) at all.
I haven't verified this, but I would be willing to bet that most of Bun's issues here have more to do with interfacing with JavaScriptCore through the C FFI than with Zig itself. This is as much a problem in Rust as it is in Zig. In fact, it has been argued that writing unsafe Zig is safer than writing unsafe Rust: https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
As someone who has researched the internals of Deno and Bun, your unverified vibe thoughts are flat out wrong. Bun is newer and buggier and that's just the way things go sometimes. You'll get over it.
I tried several times to port Node projects to Deno. Each time compatibility had "improved" but I still didn't have a working build after a few days of effort.
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
Which makes sense given that a big impetus for Deno's existence was the creator of Node/Deno (Ryan Dahl) wanting to correct things he viewed as design mistakes in Node.
I’ve been using Deno too. Although npm support has improved and it’s fine for me, I think Deno has more of a “rewrite the world” philosophy. For example, they created their own package registry [1] and their own web framework [2]. Bun seems much more focused on preexisting JavaScript projects.
It's interesting that people have directly opposite opinions on whether Deno or Bun are meant to be used with the existing ecosystem - https://news.ycombinator.com/item?id=46125049
I don’t think these are mutually exclusive takes. Bun is essentially taking Node and giving it a standard library and standard tooling. But you can still use regular node packages if you want. Whereas Deno def leaned into the clean break for a while
> Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?).
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
My team has been using it in prod for about a year now. There were some minor bugs in the runtime's implementation of buffers in 1.22 (?), but that was about the only issue we ran into.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
As far as I know, modern Node compat in Deno is also quite great - I just import packages via 'npm:package' and they work, even install scripts work. Although I remember that in the past Deno's Node compat was worse, yes.
I really want to like Deno and will likely try it again, but last time I did it was just a bit of a pain anytime I wanted to use something built for npm (which is most packages out there), whereas bun didn't have that problem.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Looking at Bun's website (the comparison table under "What's different about Bun?") and what people have said here, the only significant benefit of Bun over Node.js seems to be that it's more batteries-included - a bigger standard library, more tools, some convenience features like compiling JSX and stripping TypeScript types on-the-fly, etc.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
It just works. Whatever JavaScript/TypeScript file or dependencies I throw at it, it will run it without needing to figure out CJS or ESM, tsconfig, etc.
Same. I had a little library I wrote to wrap indexedDB and deno wouldn't even compile it because it referenced those browser apis. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, bun didn't fail to compile.
Between that and the discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very catering to me, a frontend-first dev.
Even for server ~~java~~typescript, I almost always reach for Bun nowadays. It used to be because of type stripping, which Node now has too, but it's very convenient to write a quick script, import libraries, and not have to worry about what format they are in.
Is JSC less tested? I thought it was used in Safari, which has some market share.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
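For readers unfamiliar with the technique, here is a tail call transform in miniature; a sketch, not the compiler's actual code:

```ts
// The recursive form needs proper tail calls (which JSC/Bun implements and
// V8/Node does not) to run in constant stack; the transformed form does not.
function sumTo(n: number, acc: number): number {
  if (n === 0) return acc;
  return sumTo(n - 1, acc + n); // tail position: overflows on Node for big n
}

function sumToTransformed(n: number, acc: number): number {
  while (true) {
    if (n === 0) return acc;
    [n, acc] = [n - 1, acc + n]; // the tail call becomes a rebind + jump
  }
}
```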
Twice as fast at executing JavaScript? There's absolutely zero chance this is true. A JavaScript engine that's twice as fast as V8 in general doesn't exist. There may be 5 or 10 percent difference, but nothing really meaningful.
You might want to revise what you consider to be "absolutely zero chance". Bun has an insanely fast startup time, so it definitely can be true for small workloads. A classic example of this was on Bun's website for a while[1] - it was "Running 266 React SSR tests faster than Jest can print its version number".
Keep in mind that it's not just a matter of comparing the JS engine. The runtime that is built around the engine can have a far greater impact on performance than the choice of v8 vs. JSC vs. anything else. In many microbenchmarks, Bun routinely outperforms Node.js and Deno in most tasks by a wide margin.
I find comments like this fascinating, because you're implicitly evaluating a counterfactual where Bun was built with Rust (or some other "interesting" language). Maybe Bun would be better if it were built in Rust. But maybe it would have been slower (either at runtime or development speed) and not gotten far enough along to be acquired by one of the hottest companies in the world. There's no way to know. Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?
We can imagine them making Bun an internal tool, pushing roadmap items that fit their internal products, whatever, but that doesn't answer how they earn back the money spent on the acquisition.
Profit from those products has to justify now having their own compiler team for a JavaScript runtime.
Don't engage with this guy, he shows up in every one of these threads to pattern match back to his heyday without considering any of the nuance of what is actually different this time.
Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
> At its core Zig is marketed as a competitor to C, not C++/Rust/etc
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads despite lacking any factual basis.
I always figured Bun was the "enterprise software" choice, where you'd want to use Bun tools and libraries for everything and not need to bring in much from the broader NPM library ecosystem.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
Yes, both can pull in open source libraries and I can't imagine either dropping that ability. Though they do seem to have different eagerness and competency on Node compatibility and Bun seems better on that front.
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries, instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
They tried to realign package management with web standards and tools that browsers can share (URLs and importmaps and "cache, don't install"). They didn't offer compatibility with existing package managers (notably and notoriously npm) until late in that game and took multiple swings at URL-based package repositories (deno.land/x/ and JSR), with JSR eventually realizing it needed stronger npm compatibility.
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
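For those who never tried it, the original Deno model looked roughly like this (the pinned std version is illustrative):

    // main.ts - the runtime fetches and caches the dependency on first run;
    // there is no install step and no node_modules directory.
    import { delay } from "https://deno.land/std@0.224.0/async/delay.ts";

    await delay(100);
    console.log("ran with no package.json");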
Is it just me? I don't find npm that slow. Sure, it's not a speed demon, but I rarely need to run npm install anyway, so it's not a bottleneck for me.
For deploy, usually running the attached terraform script takes more time.
So while a speed increase is welcome, I don't feel it gives me that much of a boost.
I've been using Bun since 2022 just to be trendy for recruitment (it worked, and still works despite it almost being 2026)
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun's wasn't one of them.
From the comments here it sounds like most people think the amount Anthropic paid for the company was probably not much more than the VC funding which Bun raised.
How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors get their original money returned. I understand VC has the expectation that 99 out of 100 of investments will net them no money. But what happens in the cases where money is made, it just isn’t profitable for the VC firm.
What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?
Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?
Hard to say it makes no sense when you don't know how much they were acquired for. I would guess it is a trivial amount relative to Anthropic's war chest.
I've seen a few of these seemingly random acquisitions lately, and I congratulate the companies and individuals that are acquired during this gold rush, but it definitely feels awkwardly artificial.
Quote from the CEO of Anthropic in March 2025:
"I think we'll be there in three to six months where AI is writing 90% of the code and then in 12 months we may be in a world where AI is writing essentially all of the code"
I think this wound up being close enough to true, it's just that it actually says less than what people assumed at the time.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.
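(For flavor, the ad-hoc benchmarking scripts mentioned above are typically this size - a hypothetical throwaway:)

    // throwaway: is JSON parsing actually the hot path here?
    const doc = '{"a":1,"b":[2,3],"c":"x"}';
    const t0 = performance.now();
    for (let i = 0; i < 1_000_000; i++) JSON.parse(doc);
    console.log(`1e6 parses: ${(performance.now() - t0).toFixed(0)} ms`);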
At an individual level, I think it is for some people. Opus/Sonnet 4.5 can tackle pretty much any ticket I throw at it on a system I've worked on for nearly a decade. Struggles quite a bit with design, but I'm shit at that anyway.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are there now. You still have to know what you're doing, mind you, and there's a lot of ceremony in tweaking workflows, much as there has been for editors. It's not much different than instructing juniors.
From the article, Claude Code is being used extensively to develop Bun already.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
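For the curious, a regression test in the shape described there might look like this (a sketch using bun:test; the bug and the header name are made up):

    import { test, expect } from "bun:test";

    test("custom header survives a Bun.serve round trip", async () => {
      const server = Bun.serve({
        port: 0, // let the OS pick a free port
        fetch: () => new Response("ok", { headers: { "x-made-up": "1" } }),
      });
      const res = await fetch(`http://localhost:${server.port}/`);
      // fails on the buggy system-installed Bun, passes on the fixed debug build
      expect(res.headers.get("x-made-up")).toBe("1");
      server.stop();
    });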
> You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
Same. I don’t understand how people aren’t getting this yet. I’m spending all day thinking, planning and engineering while spending very little time typing code. My productivity is through the roof. All the code in my commits is of equal quality to what I would produce myself, why wouldn’t it be? Sure one can just ask AI to do stuff and not review it and iterate, but why on earth would one do that? I’m starting to feel that anyone who’s not getting this positive experience simply isn’t good at development to begin with.
"Wasting" is doing a lot of work in that sentence.
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
Let me refer you back to the GP, where the CEO of Anthropic says AI will be writing most code in 12 months. I think the parent comment you replied to was being somewhat facetious.
Maybe he was correct in the extremely literal sense of AI producing more new lines of code than humans, because AI is no doubt very good at producing huge volumes of Stuff very quickly, but how much of that Stuff actually justifies its existence is another question entirely.
Why do people always stop this quote at the breath? The rest of it says that he still thinks they need tech employees.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
I actually like Claude Code, but that was always a risky thing to say (actually I recall him saying their software is 90% AI-produced), considering their CLI tool is literally infested with bugs. (Or at least it was last time I used it heavily. Maybe they've improved it since.)
Is this why everyone only seems to know the first half of Dario's quote? The guy in that video is commenting on a 40 second clip from twitter, not the original interview.
I'm curious what people think of quotes like these. Obviously it makes an explicit, falsifiable prediction. That prediction is false. There are so many reasons why someone could predict that it would be false. Is it just optimistic marketing speech, or do they really believe it themselves?
Everybody knows that marketing speech is optimistic. Which means if you give realistic estimates, then people are going to assume those are also optimistic.
What languages and frameworks? What is the domain space you're operating in? I use Cursor to help with some tasks, but mainly only use the autocomplete. It's great; no complaints. I just don't ever see being able to turn over anywhere close to 90% with the stuff we work on.
Hah. It can’t be “I need to spend more time to figure out how to use these tools better.” It is always “I’m just smarter than other people and have a higher standard.”
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
The tools produce mediocre code, usually working in only the most technical sense of the word, and most developers are pretty shit at writing code that doesn't suck (myself included).
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
I don't remember saying I worked with nextjs, shadcn, clerk (I don't even know what that one is), vercel or even JS/TS so I'm not sure how you can be right but I should know better than to feed the trolls.
I suspect you do not know how to use AI for writing code. No offence intended - it is a journey for everyone.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and workflow (working with the agent from a plan rather than simple prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
Always the same answer: it's the user, not the AI, being blown out of proportion. Tell me, where are all those great amazing applications that were coded 95-100% by AI? Where are the great progress, the great new algorithms, the great new innovations hiding?
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI. Probably 90-95% of it is AI driven.
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
Thanks for this! I've been looking for a good guide to an LLM based workflow, but the modern style of YouTube coding videos really grates on me. I think I might even like this :D
This one is a bit old now so a number of things have changed (I mostly use Claude Code now, Dynamic context (Skills) etc...) but here's a brief TLDR I did early this year https://www.youtube.com/watch?v=dDSLw-6vR4o
How much time do you think you saved versus writing it yourself if you factored in the time you spent setting up your AI tooling, writing prompts, contexts etc?
1. I didn't say it was a best example, I replied to a comment asking me to "Post a repo" - I posted a repo. 2. Straw man argument. I was asked for a repo, I posted a repo and clearly you didn't look at the code as it's not an "AI code generator".
1. I didn’t ask for a repo.
2. Still wasn’t me. Maybe an AI agent can help you check usernames?
3. Sorry, a plugin for an AI code generator, which is even worse of an example.
The standard argument here is that the maintainers of the core technology are likely to do a better job of hosting it because they have deeper understanding of how it all works.
Hosting is a commodity. Runtimes are too. In this case, the strategy is to make a better runtime, attract developers, and eventually give them a super easy way to run their project in the cloud, e.g. `bun deploy`, which is a reserved no-op command. I really like Bun's DX.
I mean if you're getting X number of users per day and you don't need to pay for bandwidth or anything, there's gotta be SOME way to monetize down the line.
Whether your userbase or the current CEO likes it or not.
No, but faced with either a loss or a modest return, they'll take the modest return (unless it's more beneficial not to, come tax season). Unicorns are called unicorns for a reason.
Extrapolating and wildly guessing: we could end up using all that mostly idle CPU/RAM (the non-VRAM) on the beefy GPU machines doing inference to run agentic loops where the AI executes small JS scripts in a sandbox. Bun is the best fit for that, with its faster startup times and lower RAM use, not to mention extensive native bindings that Node.js/V8 lacks. That would essentially allow multiple turns to happen before yielding back to the API caller. It would also go well with the advanced tool use that Anthropic recently announced. This would be a big competitive advantage in the age of agents.
Anyone know how much Anthropic paid for Bun? I assume it was at least $26M, so Bun could break even and pay back its own investors, but I didn't see a number in the announcements from Anthropic or Bun.
I don't really see how Bun fits as an acquisition for an AI company. This seems more like "we have tons of capital and we want to buy something great" than "Bun is essential to our core business model".
If Anthropic wants to own code development in the future, owning the full platform (including the runtime) makes sense.
Programming languages are all a balance between performance and the like on one hand, and being easy for a human to interact with on the other. This balance goes to shit as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
Even outside of code development, Anthropic seems to be leaning strongly into code interpreters over native tool calling for advancing agentic LLM abilities (e.g. their "skills" approach). Given that those necessitate a runtime of sorts, owning a runtime like Bun, which could let them integrate that functionality very seamlessly into their products, makes this acquisition seem like not the worst idea.
They will own it, and then what? Will Claude Code end every response with "by the way, did you know that you can switch to bun for 21.37x faster builds?"
TypeScript is the most popular programming language on the most popular software hosting platform though, owning the best runtime for that seems like it would fit Pareto's rule well enough:
I think there's a potential argument to be made that Anthropic isn't trying to make it easier to write TS code, but rather that their goal is a level higher and the average person wouldn't even know what "language" is running it (in the same way most TS devs don't need to care the many layers their TS code is compiled via).
It's a JS runtime, not specifically servers though? They essentially can bundle Claude Code with this, instead of ever relying on someone installing NodeJS and then running npm install.
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating / pulling JS projects, but it seems more likely Claude Code and bun will have a separate project where they integrate both and make Claude Code take full advantage of Bun itself, and Bun will focus on tight coupling to ensure Claude Code is optimally running.
Sure, but Bun was funded by VCs and needed to figure out how to monetize, what Anthropic did is ensure it is maintained and now they have fresh talent to improve Claude Code.
"Server" I used loosely here - it obviously runs on any machine (e.g. if you wanted to deploy an application with it as a runtime). But it's not useful for web dev itself, which was my point.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might be using a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
It doesn't make sense, and you definitely didn't say why it'd make sense... but enough people are happy enough to see the Bun team reach an exit (especially one that doesn't kill Bun) that I think the narrative that it makes sense will win out.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
Claude Code running on Bun is an obvious justification, but Bun's features (high-performance runtime, fast starts, native TS) are also important for training and inference. For instance, in inference you develop a logical model in code that maps to a reasoning sequence, then execute the code to validate and refine the model, then use this to inform further reasoning. Bun, which is highly integrated and highly focused on performance, is an ideal fit for this. Having Bun in house means you can use the feedback from all that automation-driven execution of Bun to drive improvements to its core.
Every time I see people mention things like this in node vs bun or deno conversations I wonder if they even tried them.
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
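To make the contrast concrete, the SEA flow from Node's docs is roughly the following (abbreviated; the exact postject fuse string and the platform-specific signing steps are in the docs):

> esbuild app.ts --bundle --platform=node --outfile=bundle.js
> node --experimental-sea-config sea-config.json (where sea-config.json contains { "main": "bundle.js", "output": "sea-prep.blob" })
> cp $(command -v node) myprogram
> npx postject myprogram NODE_SEA_BLOB sea-prep.blob --sentinel-fuse NODE_SEA_FUSE_<hash-from-docs>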
Meanwhile, here's all I need to do to get an exe out of my project right now, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets, declare imports, or do anything other than run this command line. I don't need to survey the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
Node is death through a thousand cuts compared to the various experiences offered by Bun.
Node adds quite the startup latency over Bun too and is just not too pleasant for making CLI scripts.
They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.
Microsoft owns npm outright and controls every aspect of the infrastructure that node.js relies on. It also sits on the board (and is one of the few platinum members) of the Linux Foundation, which controls openjs. It is certainly MS.
It boils down to: we didn't have full conviction that over the long run we would prove superior to Node.js; however, an AI company burning a lot of cash has invested in us by basing their toolchain on us, so they had no option but to acqui-hire us.
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e. single-file-based) and compiling to a binary with no dependencies whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All great for AI to recover during its iterations of generating something useful.
Sadly, this will be the trend moving forward. JS is perceived as a good language, and LLMs are meant to make it even easier to write. It is not about the merits of a language; it's about which languages LLMs are "good" at.
There’s like 100x more JS developers than C# developers. JS can also run code very quickly, where with an AOT language, you need to AOT compile it. For tool calls, eval-as-a-service, running in browser JS is far ahead of C#.
AI are good at JS because basically there is a ton of JS code available publicly without usage restriction: the JS code published to be executed in your browser. Most of JS code attached to web pages has no explicit license, but the implicit license is that anyone can download it and run it. Same for HTML and CSS. So using that public code to train models is a no brainer.
One other angle not yet mentioned: JS is browser-native. No matter how slow it is, the browser is now the lowest common denominator. A similar server/client codebase, while ugly, is another plus.
Same reason AIs also use Python and DBMSes offer JS or Py UDFs easily, interpreted languages take no build time and are more portable. JS is also very popular.
Might also be a context window thing. Idk how much boilerplate C# has, but others like Java spam it.
You could make a better argument for Go (compiles to native for multiple targets; zero actual dependencies, i.e. no need for a platform or virtual machine on the target).
I use Claude Code CLI daily - it's genuinely changed how I work. The $1B number sounds crazy but honestly tracks with how good the tool is. Curious how Bun integration will show up in practice beyond the native installer.
Curious about the deal value/price — any clues whether it was just to make existing investors even (so say up to $30M) or are we talking some multiple? But if it's a multiple, even 2x sounds a bit crazy.
One option is that the current Bun shareholders didn't see a profitable future, didn't much care about being made whole, and considered a return of the remaining cash adequate.
Another option is that this was an equity deal where Bun shareholders believe there is still a large multiple's worth of potential upside in the current Anthropic valuation.
I don't get it either - Bun being the foundation of tons of AI tools is like the best possible outcome; what were they hoping for when they raised the money? Or is this just an admission of "hey, that was silly, we need to land this however we can"? Or do they share major investors, and therefore this is just a consolidation? (Edit: indeed, KP did invest $100M in Anthropic this year. I'm also confused - the article states Bun raised $26M, but the KP seed round was $7M; did they do the A too but unannounced? Notably, the seed was summer 2022 and ChatGPT was Nov 30, so the world is different; did the hypothesis change?)
It's more honest than the Replicate answer but I think inevitably if you can't raise the next round and you get distracted by the shiny AI that this is the path taken by many teams. There is absolutely nothing wrong with that. There was an exuberant time when all the OSS things were getting funded, and now all AI things get funded. For many engineer founders, it's a better fit to go build deep technical stuff inside a bigger company. If I had that chance I would probably have taken it too. Good luck to the Bun team!
Bun has completely changed my outlook on the JS ecosystem. Prior to Bun, there was little focus on performance. Now the entire space rallies around it.
> Prior to Bun, there was little focus on performance.
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
> Vibe-coded projects get bought by vibe-coded companies
this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
> Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
> Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
My mistake, I was thinking of the wider ecosystem, not the runtime - i.e. formatters, bundlers, and linters like Biome, oxc, etc. being written in Rust or other compiled languages. That's where I saw the biggest speedup, because their developers decided to write them in a compiled language instead of JS on a JS runtime, where you'll inherently be limited even with a JIT.
Machine code yes (along with Spidermonkey, JSC and Nashorn), the timeframe around 2005-2010 saw the introduction of JIT'ed JS runtimes. Back then however JS was firmly single-threaded, it was only with the introduction of SharedArrayBuffer that JS really started to receive multithreading features (outside of SharedArrayBuffer and other shareable/sendable types, a runtime could opt to run stuff like WebWorkers/WebAudioWorkers in separate processes).
Early Node f.ex. had a multi-process setup built in, Node initially was about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is because TypeScript helps so damn much once projects gets a tad larger, but usage with Node hot-reloading was kinda slow, multiple seconds from saving a file until your application reloads. Even mainline node nowadays has direct .ts file loading and type erasing to quicken the workflow.
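(For reference, the Node flag - experimental in Node 22, with type stripping unflagged in newer releases:)

> node --experimental-strip-types app.ts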
All vendors will have to implement test time code execution, solution exploration, etc. as it's a low hanging fruit with huge gains, so I see it as a great hire.
Love Bun, happy for you guys!
So, what if Claude Code starts using Bun in all applicable situations? If model providers train their models to use a tech stack beneficial to their business interests?
Wondering to what degree this was done to support Anthropic’s web crawler. Would assume that having a whole JS runtime rather than just a HTTP client could be rather useful. Just hypothesising here, no clue what they use for their crawler.
There's no reason to run agents on expensive AI platforms or on GPUs when you can have the AI create an agent in JS that runs with very high performance and perfect repeatability on far less expensive CPUs.
At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.
Has CC always used Bun? When I tried it out many months ago, it was an npm install, not a bun install, in their instructions (although I did use bun install myself). Just odd that if they were using Bun, the installation wasn't specifically a "bun install" (I suppose they were trying to keep it vanilla for the npm masses?)
This decision is honestly very confusing to me as a constant user of Claude Code (I have 3 of them open at the moment.)
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The Keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
Ink (and modern alternatives) probably are the best TUI toolkit. If you want to write a UI that's genuinely good, you need e.g. HTML, or some way to express divs and flex box. There isn't really another way to build professional grade UIs; I love immediate mode UI for games, but the breadth of features handled by the browser UI ecosystem is astonishing. It is a genuinely hard problem.
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
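For anyone who hasn't seen it, Ink really is just React components rendering to the terminal; a minimal sketch:

    // app.tsx - run with `bun app.tsx`, or via Node after transpiling
    import React from "react";
    import { render, Box, Text } from "ink";

    const App = () => (
      <Box borderStyle="round" padding={1}>
        <Text color="green">a component tree, flexbox layout, in a terminal</Text>
      </Box>
    );

    render(<App />);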
I like JS for this use case, and React on web, but really not fond of the Ink usage. Idk if it's Ink itself or the way it gets used, but somehow people are making CLIs that lag and waste terminal space now.
Ink seems to be the root cause of a major issue with the Claude Code CLI where it flickers horribly when it needs to repeatedly clear the screen and redraw.
The idea that you need or want HTML or CSS to write a TUI is missing the entire point of what made TUIs great in the first place. They were great precisely because they were clean, fast, simple, focused -- and didn’t require an entire web stack to draw colored boxes.
I'm not so sure about that. I've written some nontrivial TUIs in my time, the largest one being [1], and as the project got more complicated I did find myself often thinking "It sure would be nice if I could somehow just write this stuff with CSS instead of tiny state machines and control codes for coloration". There's no reason these languages couldn't compile down to a TUI as lean as hand-coloring everything yourself.
Yes, for simple projects, absolutely. But when you're shipping something as widely adopted as CC, I disagree. At the end of the day, you're making a UI. It happens to be rendered via the terminal. You still need accessibility, consistent layouts, easy integration with your backend services, inputs, forms, and so on. If you don't need that stuff, there are lots of other, simpler options. But if you do, your other options begin to resemble a half baked, bug filled reimplementation of the web. So just use the web.
“Port it to a different language” a language that’s more out of distribution? Bad devex. Store data as an unreadable binary file? Bad devex.
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
I have to admit this was my first thought, too. I'm pretty obsessed with Claude Code, but the actual app is so incredibly poorly engineered for something that doesn't even do that much.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
So many comments about reasoning here, yet none about the very obvious one: it's not stability of the infrastructure, it's the future direction of a product like Claude Code. They need to know how to keep their optimization machine fitting developers' needs the best way possible (for good or for worse).
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
Congratulations to the team. Knowing some of the folks on the Bun team, I cannot say I am surprised. They are the top 0.001% of engineers, writing code out of love. I'm hugely bullish on Anthropic; this is a great first acquisition.
but they are a company that burns billions every year in losses and this seems like a pretty random acquisition.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
I'm wondering if Bun would be a good embedded runtime for Claude to think in. If it does sandboxing, or if they can add sandboxing, then they can standardize on a language and runtime for Claude Code and Claude Desktop and bake it into training like they do with other agentic things like tool calls. It'd be too risky to do unless they owned the runtime.
uv is very forkable - dual-licensed under Apache and MIT, high quality codebase, it's Rust rather than Python but the Python community has an increasing amount of Rust experience these days.
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem from a relatively young VC-backed company.
Honestly, given the constant rollercoaster of version management and building tools for Python the move to something else would be expected rather than surprising.
It seems like a great tool, but I remember thinking the same about pipenv, too.
uv is a revolution in every possible positive sense of the word in the Python world, and I've been here since 1.5. It is imperative that bitter old-timers like us try it; I did, and the only regret I've got is that I didn't do it sooner.
I also tried it and am now using it for new projects. But I was just fine with Poetry too. Yes, uv is faster and probably better code. But my use cases didn't involve re-creating the venvs frequently, so the slowness of Poetry didn't matter that much to me, and I am not using the "one-off script" kind of approach that uv enables (writing the dependencies in a comment in the script itself).
So, yeah, uv is nice, but for me didn't fundamentally change that much.
Honestly, that is an understatement. `uv run` has transformed how I use Python, since 99% of the time I don't need to set up or manage an environment and dependencies. I have tons of one-off Python scripts (with their dependencies in PEP 723 metadata at the top of the file) that just work with `uv run`.
I get how it might not be as useful in a production deployment where the system/container will be setup just for that Python service, but for less structured use-cases, `uv` is a silver bullet.
This reads more like Anthropic wanted to hire Jarred, and Jarred wants to work with AI rather than build a SaaS product around Bun. I doubt it has anything to do with what is best for Bun the project. Considering Bun always seemed to value performance above all else, the only real way for them to continue pursuing that value would be to move into actual JS engine design. This seems like a good pivot for Jarred personally and likely a loss for Bun.
It doesn't read like that to me at all. This reads to me like Anthropic realizing that they have $1bn in annual revenue from Claude Code that's dependent on Bun, and acquiring Bun is a great and comparatively cheap way to remove any risk from that dependency.
I haven't had any issue moving projects between Node, Bun, and Deno for years. I don't agree that the risk of Bun failing as a company affects Anthropic at all. Bun has a permissive license that Anthropic could fork from, Anthropic likely knew that Oven had a long runway and wasn't in immediate danger, and switching to a new JS CLI tool is not the huge lift most people think it is in 2025. Why pay for something you are already getting for free, can expect to keep getting for free for at least four years, and could buy for less if it fails later?
This argument doesn’t make much sense to me. Claude Code, like any product, presumably has dozens of external dependencies. What’s so special about Bun specifically that motivated an acquisition?
A dependency that forms the foundation of your build process, distribution mechanisms, and management of other dependencies is a materially different risk than a dependency that, say, colorizes terminal output.
I’m doubtful that alone motivated an acquisition, it was surely a confluence of factors, but Bun is definitely a significant dependency for Claude Code.
> MIT code, let Bun continue develop it, once project is abandoned hire the developers.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
If they found themselves pushing PRs to bun that got ignored and they wanted to speed up priority on things they needed, if the acq was cheap enough, this is the way to do it.
I'm also curious if Anthropic was worried about the funding situation for Bun. The easiest way to allay any concerns about longevity is to just acquire them outright.
It's not easy to "just" fork a huge project like Bun. You'll need to commit several devs to it, and they'll have to have Zig and JSC experience, a hard combo to hire for. In many ways, this is an acquihire.
Nah, it reads like the normal logic behind the consulting model for open source monetization, except that Bun was able to make it work with just one customer. Good for them, though it comes with some risks, especially when structured as an acquisition.
So Anthropic sees its CLI (in TypeScript) as the first-class product and is maybe planning to expand Claude Code with more JS-based agents / ecosystem? Especially since owning the runtime gives a lot of control over developer experience.
What matters: it's staying open source and MIT licensed. I sincerely hope it stays that way. Congrats to the Bun team on making a great tool and getting the recognition they deserve.
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)
Nothing gives you long term stability in tech. You have to constantly work at staying stable, and it isn't always up to anything the company is in control of, no matter what ownership they have.
Considering that 1) Bun is written in Zig, 2) Zig has a strict no-AI policy [1], and 3) Bun has joined Claude, it seems that Bun and Zig are increasingly culturally apart.
You're reading a code of conduct for contributing to the Zig project. I don't think everything there is guidance for everything written in Zig; e.g. "English is encouraged" is something one might not want for a project written in Zig by native French speakers, and I don't think that's something Zig would want to suggest to them. I read the AI part as much more motivated by the asymmetries of open source contribution than by any statement about the language itself. Fly-by AI contributions are bad because they make particularly poor use of maintainer time. Similar to the rule on proposing language changes, which can suck up lots of reading/thinking/discussion time. When you have people regularly working together (e.g. those people at Anthropic working on Bun), the incentives are different, because there is a higher cost to wasting your colleague's time.
Nothing I found says anything about Zig folks being inherently against AI. It just looks like they don’t want to deal with “AI Slop” in contributions to their project, which is very understandable.
Godspeed. Seems like a good pairing. Bun is sort of the only part of the JS ecosystem I like, and Code has become such an important tool for my work, that I think good things will come out of this match. Go Bundler as well.
I’m curious to what the acquisition price was. Bun said they’ve raised $26 million so I’m assuming the price tag has to be a lot higher than that for investors to agree to an acquisition.
Wouldn’t it make more sense to write the same functionality using a more performant, no-gc language? Aren’t competitors praised for their CLIs being faster for that reason?
With AI tooling, we are in the era where rapid iteration on product matters more than optimal runtime performance. Given that, implementing your AI tooling in a language that maximizes engineer productivity makes sense, and I believe GC does that.
JS/TS has a fundamental advantage, because there is more open source JS/TS than any other language, so LLMs training on JS/TS have more to work with. Combine that with having the largest developer community, which means you have more people using LLMs to write JS/TS than any other language, and people use it more because it works better, then the advantage compounds as you retrain on usage data.
One would expect that "AI tooling" is there for rapid iteration and one can use it with performant languages. We already had "rapid iteration" with GC languages.
If "AI tooling" makes developers more productive regardless of language, then it's still more productive to use a more productive language. If JS is more productive than C++, then "N% more productive JS" is still more productive than "N% more productive C++", for all positive N.
It seems the default is node (despite the project docs saying to use bun and all example script documentation using bun). It will use bun if told, but there’s definitely nothing saying to use node and it uses that anyway.
In the post they try to reassure readers on the following question:
"If I bet my work project or company's tech stack on Bun, will it still be around in five or ten years?"
And the thing is that we don't know if Anthropic itself will be around in five to ten years.
Sounds like the goal is to bundle Bun with Claude Code insanely tightly, to the point where it doesn't matter if you have Node.js installed locally, while also being able to optimize key things in Bun's runtime for Claude Code as needed. It's a brilliant acquisition, and Bun stays open source, which allows it to continue to grow, to Anthropic's benefit and everyone else's.
I just ln bun to npm, npx, and node. This has the added benefit of letting ts_ls and various other tools work without requiring me to have both node and bun installed locally.
Congrats Jarred and team! You have saved humanity many hours already, and I'm sure with Anthropic's backing, you will spare us many more. Farewell would-be headaches from Node & NPM tooling and waiting for builds and tests and package updates. Exciting times ahead!
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
Ha, Physics majors get the same talk about law school. It's just the selection bias of selecting for people willing to make hard pivots filtering out the under-achieving, go-with-the-flow types.
I really think this was part of the pitch deck for Bun's funding: that a bigger company would acquire it for the technology. The only reason an AI company, or any company for that matter, would acquire it would be to:
Not saying it's 100% (there's still the REPL missing), but all of Node's API is available in the sense that it's ABI compatible (or will be very near term).
If they keep it MIT licensed, then if/when things come crashing down, I think it's reasonable to expect Bun would continue on in some form, even if development slows without paid contributors.
I’ve never understood the security utility of the Deno flags. What practical attack would they protect you from? Supply chain seems to be the idea, but how many npm packages do people use that neither:
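(For reference, the flags look like this - a representative invocation, not anyone's exact setup:)

> deno run --allow-net=api.example.com --allow-read=./data script.ts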
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
Can anyone provide some color around this: "I started porting esbuild's JSX & TypeScript transpiler from Go to Zig"? Hypothetical benefits include monolanguage for development, better interoperability with C and C++, no garbage collection, and better performance. What turned out to be realized and relevant here? Please, no speculation or language flames or wars.
Interesting. Looking through a strategic lens, I feel like this is related to the $1,000 free credit for Claude Code Web (I used a few hundred). What the heck are they aiming for? CodeAct? (https://arxiv.org/abs/2402.01030)
Hahaha congratulations. This is amazing. The most unlikely outcome for a devtools team. Fascinating stuff.
This is promising for Astral et al., whom I really like but whose sustainability I worried about. It does suggest that being as close to the user as possible matters.
I don't know for sure, but it's definitely the first tool of that value to have a persistent strobing (scroll position) bug so bad that passersby ask me if I'm okay when they see it.
Man, I had never even put words to that problem but you are right that it is beyond annoying. It seems to me like it worsens the longer the Claude instance has run - I don't seem to see it early in the session.
Yeah, issues have been open on GitHub for months. I've tried shortening my scrollback history and using other emulators but it doesn't seem to make a difference. It's pretty frustrating for a paid tool.
It doesn't make a lot of sense that they'll compare Microsoft 365 Copilot with Claude Code, though? Like it is a legit CLI tool but we should ignore it because it shares the name with something else?
Terraform gets to $600mm if you squint really hard and make stuff up. Kubectl, though - whatever you want to say about Kubernetes complexity, it does get a bunch of money run through it. We could also look at aws-cli, gcloud, and az; if we assign the cloud budgets that get run through them, I'm sure it's in the hundreds of millions. Then there's git. Across the whole ecosystem, there's probably a cool couple billion floating through there. gh is probably much smaller. Other tools like docker and ansible come to mind, though those are not quite as popular. CC only hits $1B ARR if you squint really hard in the first place, so in this handwavy realm, I'd say aws-cli comes first, then kubectl, then git, with maybe docker and terraform in the mix as well. Nonetheless, Claude is a really awesome CLI tool that I use most days.
Good luck. I always worry about stuff like this because it has happened so many times and the product eventually got worse. At the same time, I understand how much effort went into building something like Bun, and people need to fund their lives somehow, so there's that.
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
> So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
...Did you miss the part where Bun used Claude to generate that PR?:)
Congrats. This is the first time I remember reading a genuine, authentic story about a sale. Much preferred over “this is about continuing the mission until my earn-out is complete.”
Look, if a terminal emulator can raise $67 million by riding the AI hype wave, then a JavaScript runtime can do the same. Nobody ever said that AI investments and acquisitions have to make any sense.
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.
Nothing against Anthropic, but with the circular financing, all the debt, OpenAI's spending, and the over-valuations, "AI" is a riskier bet than Bun and hosting.
Yeah that’s the main part that puzzled me, super happy for the team that they got a successful exit, but I wouldn’t really consider Anthropic’s situation to be stable…
Yeah, no reader of tech news will take an acquisition of a company with four years of runway as anything but decreasing the odds their product will still be around (and useful to the same audience…) in four years. Even without being tied to a company with lots of exposure to a probable bubble.
How so? Presumably Jarred got a nice enough payout that if Anthropic failed, he would not need to work. At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
> At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
Is there any historical precedent of someone doing that?
Opus 4.5 is not living in a vacuum. It's the most expensive model for coders, and there is Gemini 3 Pro, with many discounts, and DeepSeek 3.2, which is 50x cheaper and not far behind.
> I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.
It does matter. The public ultimately determines how much funding they get, if any.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. Doesn't mean Dotcom didn't crash.
There are lots of scenarios in which this could play out, e.g. Anthropic fails to raise a certain round because the money dried up, or OpenAI buys Anthropic but decides they don't need Bun and shuts down the project.
I am more shocked about the origin story compared to the acquisition.
> Almost five years ago, I was building a Minecraft-y voxel game in the browser. The codebase got kind of large, and the iteration cycle time took 45 seconds to test if changes worked. Most of that time was spent waiting for the Next.js dev server to hot reload.
Why in the hell would anyone be using Next.js to make a 3D game... Jarred has always seemed pretty smart, but this makes no sense. He could've saved so much time and avoided building a whole new runtime by simply not using the completely wrong tool for the job.
Maybe the same goes for Anthropic: they could simply write the agent in Rust/Go. Instead they decided to buy and develop a JavaScript runtime.
A lot of people seem confused about this acquisition because they think of Bun as a node.js compatible bundler / runtime and just compare it to Deno / npm. But I think its a really smart move if you think of where Bun has been pushing into lately which is a kind of cloud-native self contained runtime (S3 API, SQL, streaming, etc). For an agent like Claude Code this trajectory is really interesting as you are creating a runtime where your agent can work inside of cloud services as fluently as it currently does with a local filesystem. Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases
Yea, they just posted this a few days ago:
https://www.anthropic.com/engineering/advanced-tool-use
They discussed how running generated code is better for context management in many cases. The AI can generate code to retrieve, process, and filter the data it needs rather than doing it in-context, thus reducing context needs. Furthermore, if you can run the code right next to the server where the data is, it's all that much faster.
I see Bun like a Skynet: if it can run anywhere, the AI can run anywhere.
Java can run anywhere too
AI tools value simplicity, fast bootstrapping and iterations, this rules out the JVM which has the worst build system and package repositories I've ever had the displeasure of needing to use. Check in gradle binaries in 2025? Having to wait days for packages to sync? Wrappers on every project for Windows/Linux? It's broken beyond repair.
By contrast `bun install` is about as good as it gets.
It’s relevant enough that I feel I can roll out this bash.org classic…
<Alanna> Saying that Java is nice because it works on all OS's is like saying that anal sex is nice because it works on all genders
EDIT: someone has (much to my joy) made an archive of bash.org so here is a link[1], but I must say I’m quite jealous of today’s potential 1/10,000[2] who will discover bash.org from my comment!
[1] https://bash-org-archive.com/?338364
[2] https://xkcd.com/1053
Perhaps my biggest claim to fame is being #11 on the bash.org top 100.
That's hilarious. My comment is mostly a joke, but also trying to say that "runs everywhere" isn't that impressive anymore.
Yeah everyone proclaims to IANAL nowadays.
Not discovered from scratch, but was a big fan when it was alive and kicking. Went there from time to time to get some mood boosters. So was very sad when found that it's gone (original one). Thanks a lot for sharing that bash-org-archive.com exists, what a great fun going down this memory lane.
I’ve been browsing the archive since I left that comment, they really were the good old days weren’t they. IRC was my introduction to geekdom, and I don’t think it would be unreasonable to say it shaped my life. Here I am 30-ish years later, an old man yelling at clouds — and I wouldn’t change much!
If anyone ever requested/used an eggdrop(?) bot from #farmbots or #wildbots on quakenet then thanks to you too; that was certainly one of the next steps down the path I took. A (probably very injectable) PHP blog and a bunch of TCL scripts powering bots, man I wish I could review that code now.
I found another appropriate XKCD: https://xkcd.com/1682/
wait - how do you search the quotes??
I don’t think there is a search function, I got the exact wording from a web search (I think “bash Java anal”, arguably a dangerous search!) and then after submitting I wondered if there is an archive of the quotes.
Not in the browser, and no – WebAssembly doesn't count, otherwise you could say the same about Go and others.
Wasm does count, and you can say the same about Go and others.
Sure, they run, but they can't touch the DOM or do much that's very interesting without JavaScript.
Js just runs as is. Atwood's Law and all that.
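To make the glue concrete, here's a minimal sketch (`wasmBytes` stands in for your compiled module bytes, and `run` is a hypothetical export):

    // Wasm itself can't touch the DOM; it calls back into JS imports.
    declare const wasmBytes: BufferSource; // assumed: your compiled .wasm

    const { instance } = await WebAssembly.instantiate(wasmBytes, {
      env: { log: (x: number) => console.log(x) }, // wasm -> JS callback
    });

    // Anything DOM-shaped still has to happen on the JS side.
    (instance.exports.run as () => void)();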
I remember a time ...
May I ask, what is this obsession with targeting the browser? I've also noticed a hatred of k8s here, and while I truly understand it, I'd take the complication of managing infrastructure over frontend fads any day.
HN has a hatred of K8s? That’s new to me
This is a site for startups. They have no business running k8s, in fact, many of the lessons learned get passed on from graybeards to the younger generation along those lines. Perhaps I'm wrong! I'd love to talk shop somewhere.
Why doesn’t wasm count?
Compile step makes things more complicated.
As opposed to minimized JS.
why would the tool minify the script it generated?
Same problem, different orders of magnitude.
Java did run in the browser once... it was embedded directly in the browser. There was also NSAPI.
you could also run java with js if you are brave enough https://kreijstal.github.io/java-tools/
Java runs in the browser currently, after a transpilation step (same as .ts):
https://teavm.org/
Java is not for sale.
Java can be depended on without buying anything.
Oracle lawyers want you to think so.
Ahem, Temurin/OpenJDK disagree
Java's price is your time, which you will need tons of, as Java is highly verbose. The ultimate enterprise language.
try java 25, and update your priors :)
No amount of updates will wash away the stink of Oracle from Java.
Again, Temurin/OpenJDK disagree
Run code anywhere, hamstrung by 90s syntax and hidden code indirections.
Haven’t checked in on Java in a while?
From what I gather everyone is still stuck on Java 8 so no need to check?
Where do you gather this from? We are a startup, on Java and on 25.
No, everyone isn’t. You really should check.
This is absolutely untrue. Code from JDK 8 runs fine on JDK 25 (just released LTS). It is true that if you did something silly that locks you into certain dependency versions, you may be stuck, but this is not the majority of applications.
i haven't. do people still use the "class" keyword?
Jesus wept, for the nerds joyfully want Skynet
Yea - if you want a paranoidly-sandboxed, instant-start, high-concurrency environment, not just on beefy servers but on resource-constrained/client devices as well, you need experts in V8 integration shenanigans.
Cloudflare Workers had Kenton Varda, who had been looking at lightweight serverless architecture at Sandstorm years ago. Anthropic needs this too, for all the reasons above. Makes all the sense in the world.
Bun isn't based on V8, it's JavaScriptCore, but your point still stands.
Who would have predicted KDE could become the foundation of both AI and gaming
Also the worlds most popular web browsers
Gaming = talking about the Steam Deck?
you left out the best part...what happened to Kenton? He looked at lightweight serverless architecture..and then what?
I built Cloudflare Workers?
This is going to be a HN Classic.
This is how I found out about HN Classic! https://news.ycombinator.com/classic
Isn't what you're describing just a set of APIs with native bindings that the LLM can call?
I'm not sure I understand why it's necessary to even couple this to a runtime, let alone own the runtime?
Can't you just do it as a library and train/instruct the LLM to prefer using that library?
Mostly, just Jarred Sumner makes it worth it for Anthropic.
It's fine, but why is JS a good language for agents? I mean, sure, it's faster than Python, but wouldn't something that compiles to native be much better?
JS has the fastest, most robust and widely deployed sandboxing engines (V8, followed closely by JavaScriptCore which is what Bun uses). It also has TypeScript which pairs well with agentic coding loops, and compiles to the aforementioned JavaScript which can run pretty much anywhere.
Note that "sandboxing" in this case is strictly runtime sandboxing - it's basically like having a separate process per event loop (as if you ran separate Node processes). It does not sandbox the machine context in which it runs (i.e. it's not VM-level containment).
When you say runtime sandboxing, are you referring to JavaScript agents? I haven't worked all that much with JavaScript execution environments outside of the browser so I'm not sure about what sandboxing mechanics are available.
https://nodejs.org/api/vm.html
Bun claims this feature is for running untrusted code (https://bun.com/reference/node/vm), while Node says "The node:vm module is not a security mechanism. Do not use it to run untrusted code." I'm not sure whom to believe.
It's interesting to see the difference in how both treat the module. It feels similar to a realm which makes me lean by default to not trusting it for untrusted code execution.
It looks like Bun also supports Shadow Realms which from my understanding was more intended for sandboxing (although I have no idea how resources are shared between a host environment and Shadow Realms, and how that might potentially differ from the node VM module).
The reference docs are auto generated from node’s TypeScript types. node:vm is better than using the same global object to run untrusted code, but it’s not really a sandbox
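To make the "not really a sandbox" point concrete, here is the long-documented constructor-chain escape under Node's node:vm (a minimal sketch; not specific to any particular Node version, and Bun's behavior may differ):

    import vm from "node:vm";

    // The contextified "sandbox" object is created in the host realm, so
    // its prototype chain leads back out: this.constructor.constructor is
    // the host Function constructor.
    const pid = vm.runInNewContext(
      "this.constructor.constructor('return process')().pid"
    );
    console.log(pid); // prints the host process pid: the guest reached host globals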
Doesn’t Bun use JavaScriptCore though? Perhaps their emulation, or rather their implementation, leans more towards security.
Running it in a chroot or a scoped-down namespace is all you need most of the time anyway.
> It also has TypeScript which pairs well with agentic coding loops, (...)
I've heard that TypeScript is pretty rough on agentic coding loops because the idiomatic static type assertion code ends up requiring huge amounts of context to handle in a meaningful way. Is there any truth to it?
What exactly is the claim? For example, TypeScript isn't very verbose, and it's stripped at compilation; it doesn't generate assertion code.
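For example, a minimal sketch of the erasure (the emitted JS in the comment is what plain tsc produces):

    // TypeScript source: the annotations exist only at compile time.
    function area(w: number, h: number): number {
      return w * h;
    }
    // The compiled JavaScript is just:
    //   function area(w, h) { return w * h; }
    // No runtime type assertions are generated; the types are erased.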
Not to mention the saturation of training data
[dead]
> It also has TypeScript which pairs well with agentic coding loops
The language syntax has nothing to do with it pairing well with agentic coding loops.
Considering how syntactically close TypeScript and C# are, C#'s speed advantage over JS, among many other things, would make C# the main language for building agents. It is not, and that's because the early SDKs were JS and Python.
TypeScript is probably generally a good LLM language because of static types and tons and tons of training data.
Kind of tangent but I used to think static types were a must-have for LLM generated code. But the most magical and impressively awesome thing I’ve seen for LLM code generation is “calva backseat driver”, a vscode extension that lets copilot evaluate clojure expressions and generally do REPL stuff.
It can write MUCH cleaner and more capable code, using all sorts of libraries that it’s unfamiliar with, because it can mess around and try stuff just like a human would. It’s mind blowingly cool!!
> C#'s speed advantage over JS among many other things would make C# the main language
Nobody cares about this, JS is plenty fast for LLM needs. If maximum performance were necessary, you'd be better off using Go because of its fast compiler and better performance.
> Nobody cares about this
And that was my point. The choice of using JS/TS for LLM stuff was made for us based on initial wave of SDK availabilities. Nothing to do with language merits.
It's widespread and good enough. The language just doesn't matter that much in most cases
This is one of those, "in theory, there's no difference between theory and practice. In practice, there is" issues.
In theory, quality software can be written in any programming language.
In practice, folks who use Python or JavaScript as their application programming language start from a position of just not caring very much about correctness or performance. Folks who use languages like Java or C# do. And you can see the downstream effects of this in the difference in the production-grade developer experience and the quality of packages on offer in pip and npm versus Maven and NuGet.
As a developer who switches between Java, Python, and TypeScript every day, I think this is a fairly myopic opinion. Being siloed in one language for long enough tends to bring out our tribalistic tendencies; tread carefully.
I've seen codebases of varying quality in nearly every language, "enterprise" and otherwise. I've worked at a C# shop and it was no better or worse than the java/kotlin/typescript ones I've worked at.
You can blame the "average" developer in a language for "not caring", but more likely than not you're just observing the friction imposed by older packaging systems. Modern languages are usually coupled with package managers that make it trivial to publish language artifacts to package hubs, whereas Gradle, for example, is its own brand of hell just to get your code to build.
That's not a fair comparison. In your example, you're talking about the average of developers in a language. In this situation, it's specific developers choosing between languages. Having the developers you already have choose language A or B makes no difference to their code quality (assuming they're proficient with both)
These are statements these developers will make themselves. They will say they don't like more strictly typed languages because they feel constrained and slowed down in development. They will argue that the performance hit is worth the trade offs.
[flagged]
Exactly! In the Java ecosystem, your intelligence is measured by how elaborate an interface hell you can conjure just to do CRUD.
Chill out buddy. You're going to pop a vein here.
A typical backend developer using C#/Java is likely solving more complicated problems, with all the concerns of an enterprise system to worry about and maintain.
Dismissing a dev or a system because it is enterprisy is a weak argument to make against a language. A language being used heavily in enterprises to carry the weight of the business is a sign that it is actually great and reliable enough.
I don't know where you're getting the impression that Java and C# are somehow only for "enterprise monkey who barely knows outside of their grotesque codebase"
Could also be a way to expand the customer base for Claude Code from coding assistant to vibe coding, a la Replit creating a hosted app. CC working more closely with Bun could make all that happen much faster:
> Our default answer was always some version of "we'll eventually build a cloud hosting product.", vertically integrated with Bun’s runtime & bundler.
>Claude will be able to leverage these capabilities to extend its reach across the cloud and add more value in enterprise use cases
100%. Even more robust if paired with an overlay network that provides identity-based S3 access (rather than IP address/network based); otherwise the server may not have access to the S3/cloud resource, at least for many enterprises with S3 behind a VPN/Direct Connect.
Ditto for cases where you want the agent/client side to hit S3 directly, bypassing the server, and the agent/client may not have a permitted IP in the firewall ACL, or be on the VPN/WAN.
This is an insanely good take I never thought of.
That's a really cool use case and seems super helpful. Working cloud-native is a chore sometimes: having to fiddle with internal APIs, ACL/permissions issues.
The writeup makes it sound like an acquihire, especially the "what changes" part.
ChatGPT is feeling the pressure of Gemini [0]. So it's a bit strange for Anthropic to be focusing hard on its javascript game. Perhaps they see that as part of their advantage right now.
[0] https://timesofindia.indiatimes.com/technology/tech-news/goo...
What the ? I am either too old, or stupid, or both, to understand this. I'd expect this bullshit from Consultants.
This matches some previous comments around LLMs driving adoption of programming languages or frameworks. If you ask Claude to write a web app, why not have it use your own framework, that it was trained on, by default?
Users are far more likely to ask it about shadcn, or material, than about node/deno/bun. So, what is this about?
Currently Claude etc. can interact with services (including AWS) via MCPs.
What the user you're replying to is saying is that the Bun acquisition looks silly as a dev tool for Node. However, if you look at their binding work for services like S3 [0], the LLM will be able to interact with cloud services directly (lower latency, tighter integration, simplified deployment).
0: https://bun.com/docs/runtime/s3
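Roughly, the built-in binding looks like this (a sketch based on the S3 docs linked above; `summarize` is a hypothetical helper and exact option names may vary by Bun version):

    import { S3Client } from "bun";

    declare function summarize(data: unknown): string; // hypothetical helper

    const bucket = new S3Client({
      accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
      bucket: "reports", // hypothetical bucket name
    });

    const report = bucket.file("2025/q3.json"); // lazy reference: no request yet
    const data = await report.json();           // GET + parse
    await bucket.file("2025/q3-summary.txt").write(summarize(data)); // PUT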
That doesn't make sense either. Agents already have access to MCPs and Tools. Your example is solved by having an S3 wrapper as a set of tools.
But this embeds the tools in the agent's runtime.
Claude Code would be a much worse product if it didn't ship with web search and filesystem operations but instead required you to configure an MCP kit.
Consider this farther along that path.
An AI company scoops up frontend tech. Do you really think it was because of s3?
Bun is not really frontend tech
As a command-line end user who prefers to retrieve data from the www as text-only, I see Deno and Bun as potential replacements (for me, not necessarily for anyone else) for the so-called "modern" browser in those rare cases where I need to interpret JavaScript^1
At present the browser monstrosity is used to (automatically, indiscriminately) download into memory and run JavaScript from around the web. At least with a command-line web-capable JS runtime monstrosity, the user could in theory exercise more control over what scripts are downloaded and if and when to run them. Perhaps more user control over permissions to access system resources as well (cf. corporate control)
1. One can already see an approach something like this being used in the case of
https://github.com/yt-dlp/yt-dlp/wiki/EJS
where a commandline JS runtime is used without the need for any graphics layer (advertising display layer)
Is this something I’d have to own a tv to understand?
> At the time of writing, Bun's monthly downloads grew 25% last month (October, 2025), passing 7.2 million monthly downloads. We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
I believe this completely. They didn't have to join, which means they got a solid valuation.
> Instead of putting our users & community through "Bun, the VC-backed startups tries to figure out monetization" – thanks to Anthropic, we can skip that chapter entirely and focus on building the best JavaScript tooling.
I believe this a bit less. It'll be nice to not have some weird monetization shoved into bun, but their focus will likely shift a bit.
> They didn't have to join, which means they got a solid valuation.
Did they? I see a $7MM seed round in 2022. Now to be clear that's a great seed round and it looks like they had plenty of traction. But it's unclear to me how they were going to monetize enough to justify their $7MM investment. If they continued with the consultancy model, they would need to pay back investors from contracts they negotiate with other companies, but this is a fraught way to get early cashflow going.
Though if I'm not mistaken, Confluent did the same thing?
They had a second round that was $19m in late 2023. I don't doubt for a second that they had a long runway given the small team.
I don't like all of the decisions they made for the runtime, or some of the way they communicate over social media/company culture, but I do admire how well-run the operation seems to have been from the outside. They've done a lot with (relatively) little, which is refreshing in our industry. I don't doubt they had a long runway either.
Thanks I scrolled past that in the announcement page.
With more runway comes more investor expectations too though. Some of the concern with VC backed companies is whether the valuation remains worthwhile. $26mm in funding is plenty for 14 people, but again the question is whether they can justify their valuation.
Regardless happy for the Oven folks and Bun has been a great experience (especially for someone who got on the JS ecosystem quite late.) I'm curious what the structure of the acquisition deal was like.
I don't really know any details about this acquisition, and I assume it's the former, but acquihires are also done for other reasons than "it was the only way".
Can't edit my comment anymore but Bun posted a pretty detailed explanation of their motivation here: https://bun.com/blog/bun-joins-anthropic
Sounds like "monetizing Bun is a distraction, so we're letting a deep-pocketed buyer finance Bun moving forward".
Anthropic is still a new company and so far they seem "friendly". That being said, I still feel this can go either way.
Yeah, now they are part of Anthropic, who haven't figured out monetization themselves. Yikes!
I'm a user of Bun and an Anthropic customer. Claude Code is great and it's definitely where their models shine. Outside of that, Anthropic sucks: their apps and web UI are complete crap, borderline unusable, and the models are just meh. I get it: CC's head probably pulled a power play here, given that his department is towing the company, and his secret sauce, according to marketing from Oven, was Bun. In fact, VS Code's Claude backend is distributed as a bun-compiled binary, and the guy has been featured on the front page of the Bun website for at least a week or so. So they bought the kid the toy he asked for.
Anthropic needs urgently, instead, to acquire a good team behind a good chatbot and make something minimally decent. Then make their models work for everything else as well as they do with code.
> Yeah, now they are part of Anthropic, who haven't figured out monetization themselves.
Anthropic are on track to reach $9BN in annualised revenue by the end of the year, and the six-month-old Claude Code already accounts for $1BN of that.
> I believe this a bit less.
They weren’t acquired and paid just to build tooling as before while completely ignoring monetization until the end of time.
Maybe they were though. Maybe Anthropic just wanted to bring a key piece of the stack in-house.
Good for them, could be bad for actual users.
Given the worries about LLM focused companies reaching profitability I have concerns that Bun's runway will be hijacked... I'd hate for them to go down with the ship when the bubble pops.
This is my fear. It's one thing to lose a major sponsor. It's another to get cut due to a focus on profitability later down the line.
"We were maybe gonna fuck ya, buy now we promise we wont"
I work on Bun.
Happy to answer any questions
I'm sort of surprised to see that you used Claude Code so much. I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc. And I know Bun started with an extreme attention to detail around performance.
I would have thought LLM-generated code would run a bit counter to both of those. I had sort of carved the world into "vibe coders" who care about the eventual product but don't care so much about the "craft" of code, and people who get joy out of the actual process of coding and designing beautiful abstractions and data structures and all that, which I didn't really think worked with LLM code.
But I guess not, and this definitely causes me to update my understanding of what LLM-generated code can look like (in my day to day, I mostly see what I would consider as not very good code when it comes from an LLM).
Would you say your usage of Claude Code was more "around the edges", doing things like writing tests and documentation and such? Or did it actually help in real, crunchy problems in the depths of low level Zig code?
I am not your target with this question (I don't write Zig) but there is a spectrum of LLM usage for coding. It is possible to use LLMs extensively but almost never ship LLM generated code, except for tiny trivial functions. One can use them for ideation, quick research, or prototypes/starting places, and then build on that. That is how I use them, anyway
Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
> Culturally I see pure vibe coders as intersecting more with entrepreneurfluencer types who are non-technical but trying to extend their capabilities. Most technical folks I know are fairly disillusioned with pure vibe coding, but that's my corner of the world, YMMV
Anyone who has spent time working with LLMs knows that the LinkedIn-style vibecoding where someone writes prompts and hits enter until they ship an app doesn't work.
I've had some fun trying to coax different LLMs into writing usable small throwaway apps. It's hilarious in a way to see the contrast between what an experienced developer sees coming out of LLMs and what the LinkedIn and Twitter influencers are saying. If you know what you're doing and you have enough patience you really can get an LLM to do a lot of the things you want, but it can require a lot of handholding, rejecting bad ideas, and reviewing.
In my experience, the people pushing "vibecoding" content are influencers trying to ride the trend. They use the trend to gain more followers, sell courses, get the attention of a class of investors desperate to deploy cash, and other groups who want to believe vibecoding is magic.
I also consider them a vocal minority, because I don't think they represent the majority of LLM users.
FWIW, Copilot's license only explicitly permits using its suggestions the way you say, putting everyone using the generated outputs into a sort of unofficial grey market, even when using first-party tools. Which is weird.
Can you link to more info about this?
I'll give you a basic example where it saved me a ton of time to vibe code instead of doing it myself, and I believe it would hold true for anyone.
Creating ~50 different types of calculators in JavaScript. Gemini can bang out in seconds what would take me far longer (and it's reasonable at basic Tailwind-style front-end design to boot). A large amount of work smashed down to a couple of days of cumulative instruction + testing in my spare time. It takes far longer to think of how I want something to function in this example than it does for Gemini to successfully produce it. This is a use case where something like Gemini 3 is exceptionally capable, and far exceeds the capability requirements needed to produce a decent outcome.
Do I want my next operating system vibe coded by Gemini 3? Of course not. Can it knock out front-end JavaScript tasks trivially? Yes, and far faster than any human could ever do it. Classic situation of using a tool for things it's particularly well suited.
Here's another one. An SM-24 Geophone + Raspberry PI 5 + ADC board. Hey Gemini / GPT, I need to build bin files from the raw voltage figures + timestamps, then using flask I need a web viewer + conversion on the geophone velocity figures for displacement and acceleration. Properly instructed, they'll create a highly functional version of that with some adjustments/iteration in 15-30 minutes. I basically had them recreate REW RTA mode for my geophone velocity data, and there's no way a person could do it nearly as fast. It requires some checking and iteration, and that's assumed in the comparison.
Yeah, I had OpenAI crank out 100 different FizzBuzz implementations in a dozen seconds, and many of them worked! No chance a developer would have done it that fast, and for anyone who needs to crank out FizzBuzz implementations at scale, this is the tool to beat. The haters don't know what they're talking about.
Handmade Cities founder here.
We never associated with Bun other than extending an invitation to rent a job booth at a conference: this was years ago when I had a Twitter account, so it's fair if Jarred doesn't remember.
If Handmade Cities had the opportunity to collaborate with Bun today, we would not take it, even prior to this acquisition. HMC wants to level up systems while remaining performant, snappy and buttery smooth. Notable examples include File Pilot [0] or my own Terminal Click (still early days) [1], both coming from bootstrapped indie devs.
I'll finish with a quote from a blog post [2]:
> Serious Handmade projects, like my own Terminal Click, don’t gain from AI. It does help at the margins: I’ve delegated website work since last year, and I enjoy seamless CI/CD for my builds. This is meaningful. However, it fails at novel problems and isn’t practical for my systems programming work.
All that said, I congratulate Bun even as we disagree on philosophy. I imagine it's no small feat getting acquired!
[0] https://filepilot.tech
[1] https://terminal.click
[2] https://handmadecities.com/news/summer-update-2025/
I find this comment interesting: the parent comment didn't suggest any past association, but it seemingly uses the project reference as a pivot point to do various outgroup counter-signaling / neg Bun?
I understand the concern, but really? I found this quote enough to offer proper comments:
> had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types
Folks at Bun are "Zig people" for obvious reasons, and a link was made with Handmade software. This has happened multiple times before with Bun specifically, so my response is not a "pivot" of any kind. I've highlighted and contrasted our differences to prevent further associations inside a viral HN thread. That's not unreasonable.
I also explicitly congratulated them for the acquisition.
Indeed, you cleared up exactly the misconception I had. Thanks for chiming in to clarify
I like that the filepilot download is 2.1MB. That really illustrates the difference between handmade style stuff and well, most other stuff.
back in my day we used to write code on punch cards.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
I feel like an important step for a language is when people outside of the mainline language culture start using it in anger. In that respect, Zig has very much "made it."
That said, if I were to put on my cynical hat, I do wonder how much of that Anthropic money will be donated to the Zig Software Foundation itself. After all, throwing money at maintaining and promoting the language that powers a critical part of their infrastructure seems like a mutually beneficial arrangement.
> I had a vague idea that "Zig people" were generally "Software You Can Love" or "Handmade Software Movement" types, about small programs, exquisitely hand-written, etc, etc.
In my experience, the extreme anti-LLM people and extreme pro-vibecoding people are a vocal online minority.
If you get away from the internet yelling match, the typical use case for LLMs is in the middle. Experienced developers use them for some small tasks and also write their own code. They know when to switch between modes and how to make the most of LLMs without deferring completely to their output.
Most of all: They don't go around yelling about their LLM use (or anti-use) because they're not interesting in the online LLM wars. They just want to build things with the tools available.
More people should have such a healthy approach, not only to LLMs but to life in general. Same reason I partake less and less in online discourse: it's so tribal and filled with anger that it's just not worth it to contribute anymore. Learning how to be in the middle did wonders for me as a programmer, and I think as a person as well.
"exquisitely hand-written"
This sounds so cringe. We are talking about computer code here lol
I'm not sure about exquisite and small.
Bun genuinely made me doubt my understanding of what good software engineering is. Just take a look at their code, here are a few examples:
- this hand-rolled JS parser of 24k dense, memory-unsafe lines: https://github.com/oven-sh/bun/blob/c42539b0bf5c067e3d085646... (this is a version from quite a while ago to exclude LLM impact)
- hand-rolled re-implementation of S3 directory listing that includes "parsing" XML via hard-coded substrings https://github.com/oven-sh/bun/blob/main/src/s3/list_objects...
- MIME parsing https://github.com/oven-sh/bun/blob/main/src/http/MimeType.z...
It goes completely contrary to a lot of what I think is good software engineering. There is very little reuse, everything is ad-hoc, NIH-heavy, verbose, seemingly fragile (there's a lot of memory manipulation interwoven with business logic!), with relatively few tests or assurances.
And yet it works on many levels: as a piece of software, as a project, as a business. Therefore, how can it be anything but good engineering? It fulfils its purpose.
I can also see why it's a very good fit for LLM-heavy workflows.
I can't speak as much about the last two examples, but writing a giant parser file is pretty common in Zig from what I've seen. Here's Zig's own parser, for example[1]. I'm also not sure what you mean by memory unsafe, since all slices have bounds checks. It also looks like this uses an arena allocator, so lifetime tracking is pretty simple (dump everything onto the allocator, and copy over the result at the end). Granted, I could be misunderstanding the code, but that's the read I get of it.
[1] https://codeberg.org/ziglang/zig/src/commit/be9649f4ea5a32fd...
It used to be arena-allocated but now it's using a different technique which I outlined in this talk: https://vimeo.com/649009599
Why can't you make CLI autocompletions work? It's so basic, but the ticket has languished for almost as long as bun has existed!
Amazing news, congrats! Been using Bun for a long while now and I love it.
Is there anything I could do to improve this PR/get a review? I understand you are def very busy right now with the acquisition, but wanted to give my PR the best shot:
https://github.com/oven-sh/bun/pull/24514
Are you at liberty to divulge how much Anthropic paid for Bun?
Thanks, Jarred. Seeing what you built with Bun has been a real inspiration, the way one focused engineer can shift an entire ecosystem. It pushed me back into caring about the lower-level side of things again, and I’m grateful for that spark. Congrats on the acquisition, and excited to see what’s next
Is this acquihiring?
No. Anthropic need Bun to be healthy because they use it for Claude Code.
Isn't that still "acqui-hiring" according to common usage of the term?
Sometimes people use the term to mean that the buyer only wants some/all of the employees and will abandon or shut down the acquired company's product, which presumably isn't the case here.
But more often I see "acqui-hire" used to refer to any acquisition where the expertise of the acquired company is the main reason for the acquisition (rather than, say, an existing revenue stream), and the buyer intends to keep the existing team dynamics.
Acquihiring usually means that the product the team are working on will be ended and the team members will be set to work on other aspects of the existing company.
That is part of the definition given in the first paragraph of the Wikipedia article, but I think it’s a blurry line when the acquired company is essentially synonymous with a single open source project and the buyer wants the team of experts to continue developing that open source project.
No it isn’t. That’s not an acquihire. They’re keeping the product.
I think it’s an acquihire, and they also like Bun.
But it seems like that could happen faster internally than publicly?
I consider this more of a strategic acquisition.
Hi Jarred. Congratulations on the acquisition! Did (or will) your investors make any profit on what they put into Bun?
I've never personally used Bun. I use node.js I guess. What makes Bun fundamentally better at AI than, say, bundling a node.js app that can run anywhere?
If the answer is performance, how does Bun achieve things quicker than Node?
Easier deployment; you can generate a single binary.
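Something like this (--compile is Bun's documented single-file executable feature; the file name is just an example):

    // hello.ts – any ordinary script can be the entry point
    console.log(`hello from ${process.platform}`);

    // Compile it into a self-contained executable:
    //   bun build ./hello.ts --compile --outfile hello
    // The resulting ./hello bundles the runtime, so the target machine
    // needs neither Bun nor node_modules installed.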
on Bun's website, the runtime section features HTTP, networking, storage -- all are very web-focused. any plans to start expanding into native ML support? (e.g. GPUs, RDMA-type networking, cluster management, NFS)
Probably not. When we add new APIs in Bun, we generally base the interface off of popular existing packages. The bar is very high for a runtime to include libraries because the expectation is to support those APIs ~forever. And I can’t think of popular existing JS libraries for these things.
Congrats on the payday :)
Do you think Anthropic might request you implement private APIs?
You said elsewhere that there were many suitors. What is the single most important thing about Anthropic that leads you to believe they will be dominant in the coming years?
No idea about his feelings but believing that they will be dominant wouldn't have to be the reason he chose them. I could easily imagine that someone would decide based on (1) they offered enough money and (2) values alignment.
How much of your day-to-day is spent contributing code to the Bun codebase and do you expect it to decrease as Anthropic assigns more people to work on Bun?
Hi Jarred,
I contributed to Bun one time for SQLite. I've a question about the licensing. Will each contributor continue to retain their copyright, or will a CLA be introduced?
Thanks
With Bun's existing OSS license and contribution model, all contributors retain their copyright and Bun retains the license to use those contributions. An acquisition of this kind cannot change the terms under which prior contributions were made without explicit agreement from all contributors. If Bun did switch to a CLA in the future, just like with any OSS project, that would only impact future contributions made after that CLA went into effect and it depends entirely on the terms established in that hypothetical CLA.
Does this acquisition preclude implementing an s3 style integration for AWS bedrock? Also is IMDSv2 auth on the roadmap?
Any chance there will be some kind of updating mechanism for 'compiled' bun executables?
I have a PR that’s been sitting for a while that exposes the extra options from the renameat2 and renameatx_np syscalls, which are a good way to implement self-updaters that work even when multiple processes are updating the same path on disk at the same time. These syscalls are supported on Linux & macOS, but I don’t think there’s an equivalent on Windows. We use them internally for `bun install` to make adding packages into the global install cache work when multiple `bun install` processes are running simultaneously.
No high-level self-updater API is planned right now, but yes for at least the low-level parts needed to make a good one.
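For readers following along, the portable core of that pattern looks something like this (a minimal sketch; the renameat2/renameatx_np flags mentioned above aren't exposed by the standard node:fs API, so this uses only plain rename, which is atomic within one filesystem on POSIX):

    import { writeFile, rename } from "node:fs/promises";

    // Write the new contents to a unique temp file, then rename() it over
    // the destination. Concurrent readers see either the old file or the
    // new one, never a torn write.
    async function replaceAtomically(dest: string, contents: Uint8Array) {
      const tmp = `${dest}.tmp-${process.pid}-${Date.now()}`;
      await writeFile(tmp, contents, { mode: 0o755 });
      await rename(tmp, dest);
    }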
Hi Jarred, thanks for all your work on Bun.
I know that one thing you guys are working on or are at least aware of is the size of single-file executables. From a technical perspective, is there a path forward on this?
I'm not familiar with Bun's internals, but in order to get the size down, it seems like you'd have to somehow split up/modularize Bun itself and potentially JavaScriptCore as well (not sure how big the latter is). That way only the things that are actually being used by the bundled code are included in the executable.
Is this even possible? Is the difficulty on the Bun/Zig side of things, or JSC, or something else? Seems like a very interesting (and very difficult) technical problem.
how the hell did you get that OG name here on HN
asking the real questions
One more thing I hope doesn't change is the fun release videos :-) I really enjoy them. They're very Apple-y, and for just a programming tool.
"work on Bun." LOL.
Congratulations.
Yeah why are you not out on a boat somewhere enjoying this moment? Go have fun please.
Acquisitions typically have additional stipulations you have to follow - they probably have new deadlines and some temporary stress for the next few months.
yes, acquisitions rarely result in an immediate cash payout.
What happens to Bun in a scenario where Anthropic goes under?
Any thoughts on the claude "soul document" that was leaked this week?
I wonder if this is a sign of AI companies trying to pivot?
> Bun will ship faster.
That'll last until FY 2027. This is an old lie that acquirers encourage the old owner to say because they have no power to enforce it, and they didn't actually say it so they're not on the hook. It's practically a cheesy pickup line, and given the context, it kinda is.
This is why we can't have nice things
I would like to clarify that I wish I weren't right but I probably am.
I’ll be honest, while I have my doubts about the match of interests and cohesion between an AI company and a JS runtime company I have to say this is the single best acquisition announcement blog post I’ve seen in 20 years or so.
Very direct, very plain and detailed. They cover all the bases about the why, the how, and what to expect. I really appreciate it.
Best of luck to the team and hopefully the new home will support them well.
But how is another company that is also VC backed and losing money providing stability for Bun?
How long before we hear about “Our Amazing Journey”?
On the other hand, I would rather see someone like Bun have a successful exit, where the founders started out with a passion project, got funding, built something they were excited about, and then exited, than yet another AI company from non-technical founders, built with the sole purpose of getting funding and exiting.
Anthropic may be losing money, but a company with $7bn revenue run rate (https://www.anthropic.com/news/statement-dario-amodei-americ...) is a whole lot healthier than a company with a revenue of 0.
If I had the cash, I could sell dollar bills for 50 cents and do a $7b run rate :)
If that was genuinely happening here - Anthropic were selling inference for less than the power and data center costs needed to serve those tokens - it would indeed be a very bad sign for their health.
I don't think they're doing that.
Estimates I've seen have their inference margin at ~60% - there's one from Morgan Stanley in this article, for example: https://www.businessinsider.com/amazon-anthropic-billions-cl...
>The bank's analysts then assumed Anthropic gross profit margins of 60%, and estimated that 75% of related costs are spent on AWS cloud services.
Not estimate, assumption.
Those are estimates. Notice they didn’t assume 0% or a million %. They chose numbers that are a plausible approximation of the true unknown values, also known as an estimate.
If Morgan Stanley are willing to stake their credibility on an assumption I'm going to take that assumption seriously.
This is a pretty silly thing to say. Investment banks suffer zero reputational damage when their analysts get this sort of thing wrong. They don’t even have to care about accuracy because there will never be a way to even check this number, if anyone even wanted to go back and rate their assumptions, which also never happens.
Fair enough. I was looking for a shortcut way of saying "I find this guess credible", see also: https://news.ycombinator.com/item?id=46126597
Calling this unmotivated assumption an "estimate" is just plain lying though, regardless of the faith you have in the source of the assumption.
I've seen a bunch of other estimates / claims of a 50-60% margin for Anthropic on serving. This was just the first one where I found a credible-looking link I could drop into this discussion.
The best one is from the Information, but they're behind a paywall so not useful to link to. https://www.theinformation.com/articles/anthropic-projects-7...
They had pretty drastic price cuts on Opus 4.5. It's possible they're now selling inference at a loss to gain market share, or at least that their margins are much lower. Dario claims that all their previous models were profitable (even after accounting for research costs), but it's unclear that there's a path to keeping their previous margins and expanding revenue as fast or faster than their costs (each model has been substantially more expensive than the previous model).
It wouldn't surprise me if they found ways to reduce the cost of serving Opus 4.5. All of the model vendors have been consistently finding new optimizations over the last few years.
I sure hope serving Opus 4.5 at the current cost is sustainable. It’s the first model I can actually use for serious work.
I've been wondering about this generally... Are the per-request API prices I'm paying at a profit or a loss? My billing would suggest they are not making a profit on the monthly fees (unless there are a bunch of enterprise accounts in group deals not being used, I am one of those I think)
But those AI/ML researchers, aka LLM optimization staff, are not cheap. Their salaries have skyrocketed, and some are being fought over like top-tier soccer stars and actors/actresses.
The leaders of Anthropic, OpenAI and DeepMind all hope to create models that are much more powerful than the ones they have now.
A large portion of the many tens of billions of dollars they have at their disposal (OpenAI alone raised 40 billion in April) is probably going toward this ambition—basically a huge science experiment. For example, when an AI lab offers an individual researcher a $250 million pay package, it can only be because they hope that the researcher can help them with something very ambitious: there's no need to pay that much for a single employee to help them reduce the costs of serving the paying customers they have now.
The point is that you can be right that Anthropic is making money on the marginal new user of Claude, but Anthropic's investors might still get soaked if the huge science experiment does not bear fruit.
> their investors might still take a bath if the very-ambitious aspect of their operations do not bear fruit
Not really. If the technology stalls where it is, AI still gets a sizable chunk of the dollars previously paid to coders, transcribers, translators and the like.
Surely you understand the bet Anthropic is making, and why it's a bit different than selling dollars at a discount
Because discounted dollar bills are still a tangible asset, but churning language models are intangible?
Maybe for those of us not-too-clever ones, what is the bet? Why is it different? Would be pretty great to have like a clear articulation of this!
The bet, (I would have thought) obviously, is that AI will be a huge part of humanity’s future, and that Anthropic will be able to get a big piece of that pie.
This is (I would have thought) obviously different from selling dollars for $0.50, which is a plan with zero probability of profit.
Edit: perhaps the question was meant to be about how Bun fits in? But the context of this sub-thread has veered to achieving a $7 billion revenue.
The question is/was about how they intend to obtain that big piece of pie, what that looks like.
You are saying that you can raise $7b of debt at a double-digit interest rate. I am doubtful. While $7b is not a big number, the Madoff scam was only ~$70b in total over many years.
> the Madoff scam is only ~$70b in total
Incorrect - that was the fraudulent NAV.
An estimate for true cash inflow that was lost is about $20 billion (which is still an enormous number!)
No, I'm scamming myself. Halving my fortune because I believe karma will somehow repay me ten fold some time later.
Somehow? I've been keeping an eye on my inbox, waiting to get a karma vesting plan from HN, for ages. What's this talk of somehow?
you have anthropic confused with something like lovable.
anthropic's unit margins are fine, many lovable-like businesses are not.
Or I'm just saying revenue numbers alone don't prove anything useful when you have deep pockets.
They don't need revenue, they need a community. I don't know how this acquisition will affect that.
I am fairly skeptical about many AI companies, but as someone else pointed out, Anthropic has 10x'ed their revenue each year for the past 3 years: 100m -> 1b -> 10b. While past performance is no predictor of future results, their product is solid and to me it looks like they have found PMF.
Idk, I’m no business expert by any means, but I’m a hell of a lot more _scared_ of a company burning so much that at $7b in revenue it's still losing money.
Often it happens that VCs buy out companies from a friendly fund because the selling fund wants to show performance to their investors until "the big one", or to move cash from one wealthy pocket to another.
"You buy me this, next time I save you on that", etc...
"Raised $19 million Series A led by Khosla Ventures + $7 million"
"Today, Bun makes $0 in revenue."
Everything is almost public domain (MIT) and can be forked without paying a single dollar.
Questionable to claim that the technology is the real reason this was bought.
It's an acquihire. If Anthropic is already spending significant resources to improve Bun internally, or sees that they will have to, it makes a lot of sense. No nefarious undertones required.
An analogous example off the top of my head is Shopify hired Rafael Franca to work on Rails full-time.
If it was an acquihire, still a lot less slimy than just offering the employees they care about a large compensation package and leaving the company behind as a husk like Amazon, Google and Microsoft have done recently.
Is it? What's wrong with hiring talent for a higher salary?
You have no responsibility for an unrelated company's operations; if that was important to them they could have paid their talent more.
From the acquirer’s perspective, you’re right. (Bonus: it diminishes your own employees’ ability to leave and fundraise to compete with you.)
From an ecosystem perspective, acquihires trash the funding landscape. And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward. But that isn’t relevant if the individual pay-off is big.
> And from the employees’ perspective, as an investor, I’d see them being on an early founding team as a risk going forward.
Every employee is a flight risk if you don't pay them a competitive salary; that's just FUD from VC bros who are getting their playbook (sell the company to the highest bidder and let early employees get screwed) used against them.
> Every employee is a flight risk if you don't pay them a competitive salary
Not relevant to acquihires, who typically aren’t hired away with promises of a salary but instead large signing bonuses, et cetera, and aren’t typically hired individually but as teams. (You can’t solve key man problems with compensation alone, despite what every CEO compensation committee will lead one to think.)
> that's just FUD
What does FUD mean in this context? I’m precisely relaying a personal anecdote.
> aren’t hired away with promises of a salary but instead large signing bonuses
Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary.
> aren’t typically hired individually but as teams.
So? VC bros seem to forget the labor market is also a free market as soon it hurts their cashout opportunity.
> What does FUD mean in this context? I’m precisely relaying a personal anecdote.
Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future.
> Now you're being nitpicky. Take the vesting period of the sign on bonus, divide the bonus amount by that and add it to the regular salary and you get the effective salary
These aren't the same things, and nobody negotiating an acquisition or acquihire converts in this way. (I've done both.)
> Fear, Uncertainty and Doubt. Your anecdote is little more than a scare story. It can be summarized as: if you don't let us cashout this time, we'll hold this against you in some undefined future
It's a personal anecdote. There shouldn't be any uncertainty about what I personally believe. I've literally negotiated acquihires. If you're getting a multimillion dollar payout, you shouldn't be particularly concerned about your standing in the next founding team unless you're a serial entrepreneur.
Broader online comment: invoking FUD seems like shorthand for objecting to something without knowing (or wanting to say) why.
And the secretaries, sales, project managers, etc. who get left behind because the founders and key people were taken away? In an acquisition, they may still be let go, but they would also make money from their equity.
You want those people specifically. To get them, you need to hire them for a lot more money than you pay your current folks. That causes a lot of resentment with folks and messes up things like salary bands, etc.
But since they own equity in the current company, you can give them a ton of money by buying out that equity/paying acquisition bonuses that are conditional on staying for specific amounts of time, etc. And your current staff doesn't feel left out because "it's an acquisition" the way they would if you just paid some engineers 10x or 100x what you pay them.
Who should be paying the founders more? The ones that made a deal with the VCs? They would be hired away from the company.
I left out the part that the motivation for the acquirers was not to save money or to be slimy. It was the only way to get around overzealous government regulators making it harder to acquire companies.
The real risk is not that Anthropic will run out of money, but that they will change their strategy to something that isn't Bun-based, and supporting Bun won't make sense for them any more.
Is there anything you’d need from bun in the future that can’t be done by forking it?
> But how is another company that is also VC backed and losing money providing stability for Bun?
Reminds me of when Tron, the crypto company, bought BitTorrent.
The difference is that Tron is a scam and BitTorrent Inc was nothing special either.
Match made in heaven considering BitTorrent Inc bundles crypto miners and other malware with μTorrent.
GIF of Pam from the office saying, “They’re the same picture.”
I misread Amazon, implying that Amazon might buy Anthropic, and I think that's what will end up happening.
In my three or four non chatbot related projects, I’ve found Amazon’s Nova models to be just as good as Anthropic’s.
Ditto, and I got to know Bun via HN. It seemed intriguing, but also "why another JS runtime" etc.
If Bun embraces the sweet spot around edge computing, modern JS/TS and AI services, I think their future ahead looks bright.
Bun seems more alive than Deno, FWIW.
I admit, it is a good acquisition announcement. But I can’t remember the last acquisition announcement whose promises were kept for more than 1-2 years. Leadership changes, priorities shift…
One thing I like about this, beyond it meaning Bun will be funded, is that Anthropic is a registered public benefit corporation. While this doesn't mean Anthropic can't fuck over the users of Bun, it at least puts in some roadblocks. The path of least resistance here should be to improve Bun for users, not to monetize it to the point where it's no longer valuable.
> Anthropic is a registered public benefit corporation
Does that mean anything at all?
OpenAI is a public benefit corporation.
I had the same impression: bottom line up front, didn’t bury the lede, no weasel language.
I wonder what this means for Deno.
Will this make it more or less likely for people to use Bun vs Deno?
And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
Bun and Deno's goals seem quite different, I don't expect that to change. Bun is a one stop shop with an ever increasing number of built-in high-level APIs. Deno is focused on low level APIs, security, and building out a standard lib/ecosystem that (mostly) supports all JS environments.
People who like Bun for what it is are probably still going to, and same goes for Deno.
That being said I don't see how Anthropic is really adding long term stability to Bun.
I think Deno's management have been somewhat distracted by their ongoing lawsuits with Oracle over the release of the Javascript trademark.
I started out with Deno and when I discovered Bun, I pivoted. Personally I don't need the NodeJS/NPM compatibility. Wish there was a Bun-lite which was freed of the backward compatibility.
Ironically, this was early Deno - but then adoption required backwards compatibility.
I'm in a similar position.
I use Hono, Zod, and Drizzle which AFAIK don't need Node compat.
IIRC I've only used Node compat once to delete a folder recursively with rm.
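For reference, that entire piece of Node compat is a single call, something like this (a sketch; "dist" is a stand-in folder name):

    // delete a folder recursively; works in Bun, Node, and Deno's Node compat
    import { rm } from "node:fs/promises";

    await rm("dist", { recursive: true, force: true }); // roughly `rm -rf dist`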
What do you dislike about having node compatibility?
> Will this make it more or less likely for people to use Bun vs Deno?
I'm not sure it will make much of a difference in the short term.
For those who were drawn to Bun by hype and/or some concerns around speed, they will continue to use Bun.
For me personally, I will continue to use Node for legacy projects and will continue using Deno for current projects.
I'm not interested in Bun for its hype (since hype is fleeting). I have a reserved interest in Bun's approach to speed, but I don't see it being a significant factor, since most JS speed concerns come from downloading dependencies (which is a once-off operation) and terrible JS framework practices (which aren't resolved by changing engines anyway).
----------------------------
The two largest problems I see in JS are:
1. Terrible security practices
2. A lack of a standard library which pushes people into dependency hell
Deno fixes both of those problems with a proper permission model and a standard library.
----------------------------
> And now that Bun doesn't need to run a profitable cloud company will they move faster and get ahead of Deno?
I think any predictions between 1-10 years are going to be a little too chaotic. It all depends on how the AI bubble goes away.
But after 10 years, I can see runtimes switching from their current engines to one based on Boa, Kiesel or something similar.
Prediction: Bun is absorbed in-house and used by Anthropic as a faster/cheaper place for Claude to run code.
It fades away as a direct-to-developer tool.
This is a good thing for Deno.
Anthropic has been trying to win the developer marketshare, and has been quite successful with Claude Code. While I understand the argument that this acquisition is to protect their usage in CC or even just to acquire the team, I do hope that part of their goal is to use this to strengthen their brand. Being good stewards of open source projects is a huge part of how positively I view a company.
> Being good stewards of open source projects is a huge part of how positively I view a company.
Maybe an easier first step would be to open source Claude Code...?
As someone who has been using Deno for the last few years, is there anything that Bun does better? Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?). The last time I checked Bun's source code, it was... quite messy and spaghetti-like, plus Zig doesn't really offer many safety features, so it's not that hard to write incorrect code. Zig does force some safety with ReleaseSafe IIRC, but it's still not the same as even modern C++, let alone Rust.
I'll admit I'm somewhat biased against Bun, but I'm honestly interested in knowing why people prefer Bun over Deno.
I haven't used Deno, but I do use Bun purely as a replacement for npm. It does the hard-linking thing that seems to be increasingly common for package managers these days (i.e. it populates your local node_modules with a bunch of hard links to its systemwide cache), which makes it vastly quicker and more disk-efficient than npm for most usage.
Even with a cold cache, `bun install` with a large-ish dependency graph is significantly faster than `npm install` in my experience.
I don't know if Deno does that, but some googling for "deno install performance vs npm install" doesn't turn up much, so I suspect not?
As a runtime, though, I have no opinion. I did test it against Node, but for my use case (build tooling for web projects) it didn't make a noticeable difference, so I decided to stick with Node.
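If you're curious whether your node_modules is actually hard-linked, here's a quick check (a sketch; "react" is just an example package, and the exact behavior varies by package manager and platform):

    // a file hard-linked into a shared cache usually has a link count > 1
    import { statSync } from "node:fs";

    const { nlink } = statSync("node_modules/react/package.json");
    console.log(nlink > 1 ? "hard-linked into a shared cache" : "plain copy");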
Deno does all that. Hell, yarn does too, or pnpm as the sibling mentioned.
Sure, but pnpm is very slow compared to bun.
Deno does that. It also refrains from keeping a local node_modules at all until/unless you explicitly ask for one for whatever compatibility reason. There are plugins for tools like esbuild to use the Deno resolver and not need a node_modules at all (if you aren't also using the Deno-provided bundler, which disappeared for a couple of versions and is still marked "experimental").
pnpm does all that on top of node. Also disables postinstall scripts by default, making the recent security incidents we've seen a non-issue.
As a victim of the larger pre-Shai-Hulud attack: unfortunately, install-script validation wouldn't have protected you. Also, if you already have an infected package on the whitelist, a new infection in its install script will still affect you.
A whitelist in package.json is only a partial assist.
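For reference, the whitelist under discussion lives in package.json and looks roughly like this (a sketch; `onlyBuiltDependencies` is the field recent pnpm versions use, and the package names are just examples):

    {
      "pnpm": {
        "onlyBuiltDependencies": ["esbuild", "sharp"]
      }
    }

Only packages named there get to run their install scripts; everything else has its scripts skipped.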
I’m not sure why but bun still feels snappier.
This is why: https://bun.com/blog/behind-the-scenes-of-bun-install
Aside from speed, what would the major selling points be on migrating from pnpm to bun?
Are there any popular packages that require postinstall scripts that this hurts?
IIRC the Bun Zig codebase has a lot of fine-grained optimization too. I think the lead gave a conference talk explaining his work. Or maybe I'm confused.
https://bun.com/blog/behind-the-scenes-of-bun-install
Oh thanks, yes. I couldn't find it; I was already lost thinking it was a conference talk by Andrew Kelley. Thanks a lot.
I decided to stick with Node in general. I don't see any compelling reason to change it.
Faster installs and less disk space due to hardlinks? Not really all that important to me. npm comes with a cache too, and I have the disk space. I don't need it to be faster.
With the old-school setup I can easily manually edit something in node_modules to quickly test a change.
No more node_modules? It was a cool idea when yarn 2 initially implemented it, but at the end of the day I prefer things to just work rather than debug what is and isn't broken by the new resolver. At the time my DevOps team also wasn't too excited about me proposing to put the dependencies into git for the zero-install.
Search for pointer exceptions or core dumps on Bun's GitHub issues and you'll see why people (should) use Deno over Bun, if only because Rust is a far safer language than Zig.
This is a non sequitur. Rust, Zig, and any other language have the ability to end in an exception state. Whether it be a kernel exception, a pointer exception, or Rust's panic!, these things exist.
The reason you see so many GitHub issues about it is because that's where the development is. Deno is great. Bun is great. These two things can both be great and we don't have to choose sides. Deno has its use case. Bun has its own. Deno wants to be secure and require permissions. Bun just wants to make clean, simple projects. This fight between Rust vs. The World is getting old. Rust isn't any "safer" when Deno can panic too.
Don't make a false equivalence: how many times does one get a panic from Deno versus a segmentation fault in Bun? It's not a similar number, and it's simply wrong to say both are equally unsafe when that's plainly untrue.
Anecdotally? Zero segfaults with Bun since I started using it back in beta.
I use Bun in production. Well, one of my clients does.
We have yet to witness a segfault. Admittedly it's a bunch of microservices and not many requests/s (around 5k avg).
The only time I got a segfault in Bun is when I used bun:ffi to wrap glfw and wgpu-native so I could run three.js on the desktop. Ironically, the segfault was in wgpu, which is Rust. But to be fair, it was because the glfw surface had dirty flags for OpenGL and didn't have the Vulkan extensions, so anyone would have faulted.
> This is a non sequitur. Rust, Zig, and any other language have the ability to end in an exception state.
There are degrees to this though. A panic + unwind in Rust is clean and _safe_, thus preferable to segfaults.
Java and Go are another similar example. Only in the latter can races on multi-word data structures lead to "arbitrary memory corruption" [1]. Even in those GC'd languages there are degrees of memory safety.
1: https://go.dev/ref/mem
I'll take a small panic and unwind any day over a total burnout crash. Matters in code and life.
I agree. Pointing at Github issues is a strange metric to me. If we want to use that as a canary then you shouldn't use Deno (2.4k open issues) or Bun (4.5k open issues) at all.
I haven't verified this, but I would be willing to bet that most of Bun's issues here have more to do with interfacing with JavaScriptCore through the C FFI than Zig itself. This is as much a problem in Rust as it is in Zig. In fact, it has been argued that writing unsafe Zig is safer than writing unsafe Rust: https://zackoverflow.dev/writing/unsafe-rust-vs-zig/
As someone who has researched the internals of Deno and Bun, your unverified vibe thoughts are flat out wrong. Bun is newer and buggier and that's just the way things go sometimes. You'll get over it.
Easily bundling and serving frontend code from your backend code is very appealing: https://bun.com/docs/bundler/fullstack
Despite the page title being "Fullstack dev server", it's also useful in production (Ctrl-F "Production Mode").
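A minimal sketch of what that looks like, going by the linked docs (the HTML-import plus `routes` API; details may differ across Bun versions):

    // server.ts: one process serves the bundled frontend and the API
    import index from "./index.html"; // Bun bundles the page and its scripts/styles

    Bun.serve({
      routes: {
        "/": index,                                          // bundled frontend
        "/api/hello": () => Response.json({ hello: "bun" }), // backend endpoint
      },
    });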
I tried several times to port Node projects to Deno. Each time compatibility had "improved" but I still didn't have a working build after a few days of effort.
I don't know how Deno is today. I switched to Bun and porting went a lot smoother.
Philosophically, I like that Bun sees Node compatibility as an obvious top priority. Deno sees it as a grudging necessity after losing the fight to do things differently.
Which makes sense given that a big impetus for Deno's existence was the creator of Node/Deno (Ryan Dahl) wanting to correct things he viewed as design mistakes in Node.
I’ve been using Deno too. Although npm support has improved and it’s fine for me, I think Deno has more of a “rewrite the world” philosophy. For example, they created their own package registry [1] and their own web framework [2]. Bun seems much more focused on preexisting JavaScript projects.
[1] https://jsr.io/ [2] https://fresh.deno.dev/
It's interesting that people have directly opposite opinions on whether Deno or Bun are meant to be used with the existing ecosystem - https://news.ycombinator.com/item?id=46125049
I don’t think these are mutually exclusive takes. Bun is essentially taking Node and giving it a standard library and standard tooling. But you can still use regular Node packages if you want. Whereas Deno definitely leaned into the clean break for a while.
> Bun seems to use a different runtime (JSC) which is less tested than V8, which makes me assume it might perform worse in real-world tasks (maybe not anymore?).
JSC is still the JS engine for WebKit-based browsers, especially Safari, and per Apple App Store regulations the only JS engine supposedly allowable in all of iOS.
It's more "mature" than V8 in terms of predating it. (V8 was not a fork of it and was started from scratch, but V8 was designed to replace it in the Blink fork from WebKit.)
It has different performance goals and performance characteristics, but "less tested" seems uncharitable and it is certainly used in plenty of "real-world tasks" daily in iOS and macOS.
My team has been using it in prod for about a year now. There were some minor bugs in the runtime's implementation of buffers in 1.22 (?), but that was about the only issue we ran into.
The nice things:
1. It's fast.
2. The standard library is great. (This may be less of an advantage over Deno.)
3. There's a ton of momentum behind it.
4. It's closer to Node.js than Deno is, at least last I tried. There were a bunch of little Node <> Deno papercuts. For example, Deno wanted .ts extensions on all imports.
5. I don't have to think about JSR.
The warts:
1. The package manager has some issues that make it hard for us to use. I've forgotten why now, but this in particular bit us in the ass: https://github.com/oven-sh/bun/issues/6608. We use PNPM and are very happy with it, even if it's not as fast as Bun's package manager.
Overall, Deno felt to me like they were building a parallel ecosystem that I don't have a ton of conviction in, while Bun feels focused on meeting me where I am.
It has wayyyyy better nodejs compatibility (day 1 goal)
As far as I know, modern Node compat in Deno is also quite great - I just import packages via 'npm:package' and they work, even install scripts work. Although I remember that in the past Deno's Node compat was worse, yes.
Pretty sure one of the Deno day 1 goals was to correct mistakes made during the early days of Node.js.
I really want to like Deno and will likely try it again, but last time I did it was just a bit of a pain anytime I wanted to use something built for npm (which is most packages out there), whereas bun didn't have that problem.
There's certainly an argument to be made that, like any good tool, you have to learn Deno and can't fall back on just reusing node knowledge, and I'd absolutely agree with that, but in that case I wanted to learn the package, not the package manager.
Edit: Also, it has a nice standard library. Not a huge win, because that stuff is also doable in Deno, but again, it's just a bit less painful.
Looking at Bun's website (the comparison table under "What's different about Bun?") and what people have said here, the only significant benefit of Bun over Node.js seems to be that it's more batteries-included - a bigger standard library, more tools, some convenience features like compiling JSX and stripping TypeScript types on-the-fly, etc.
It's not clear to me why that requires creating a whole new runtime, or why they made the decisions they did, like choosing JSC instead of V8, or using a pre-1.0 language like Zig.
I had memory leaks in bun and not in deno or node for the same code. ymmv
It just works. Whatever JavaScript/TypeScript file or dependencies I throw at it, it will run it without needing to figure out CJS or ESM, tsconfig, etc.
I haven't had that experience with deno (or node)
Same. I had a little library I wrote to wrap indexedDB and deno wouldn't even compile it because it referenced those browser apis. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, bun didn't fail to compile.
Between that and the discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very catering to me, a frontend-first dev.
Even for server ~~java~~typescript, I almost always reach for Bun nowadays. Used to be because of typestripping, which node now has too, but it's very convenient to write a quick script, import libraries and not have to worry about what format they are in.
Is JSC less tested? I thought it was used in Safari, which has some market share.
I used bun briefly to run the output of my compiler, because it was the only javascript runtime that did tail calls. But I eventually added a tail call transform to my compiler and switched to node, which runs 40% faster for my test case (the compiler building itself).
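To illustrate what such a transform does (a toy sketch, not that compiler's actual output):

    // a self tail call: overflows the stack for large n on engines without TCO
    const sumRec = (n: number, acc = 0): number =>
      n === 0 ? acc : sumRec(n - 1, acc + n);

    // the transform rewrites it into a loop that reuses one stack frame
    function sumLoop(n: number, acc = 0): number {
      while (n !== 0) { acc += n; n -= 1; }
      return acc;
    }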
I've found it to be at least twice as fast with practically no compat issues.
Twice as fast at executing JavaScript? There's absolutely zero chance this is true. A JavaScript engine that's twice as fast as V8 in general doesn't exist. There may be 5 or 10 percent difference, but nothing really meaningful.
You might want to revise what you consider to be "absolutely zero chance". Bun has an insanely fast startup time, so it definitely can be true for small workloads. A classic example of this was on Bun's website for a while[1] - it was "Running 266 React SSR tests faster than Jest can print its version number".
[1]: https://x.com/jarredsumner/status/1542824445810642946
Keep in mind that it's not just a matter of comparing the JS engine. The runtime that is built around the engine can have a far greater impact on performance than the choice of v8 vs. JSC vs. anything else. In many microbenchmarks, Bun routinely outperforms Node.js and Deno in most tasks by a wide margin.
It depends on what. Bun has some major optimisations. You’ll have to read into them if you don’t believe me. The graphs don’t come from nowhere
Agreed; the language would have been interesting during the 1990s, nowadays not so much.
The tools the language offers to handle use-after-free are hardly any different from using Purify or Insure++ back in 2000.
I find comments like this fascinating, because you're implicitly evaluating a counterfactual where Bun was built with Rust (or some other "interesting" language). Maybe Bun would be better if it were built in Rust. But maybe it would have been slower (either at runtime or development speed) and not gotten far enough along to be acquired by one of the hottest companies in the world. There's no way to know. Why did Anthropic choose Bun instead of Deno, if Deno is written in a better language?
Because maybe they reached out to them and they didn't take the money, while the Bun folks' business model wasn't working out?
Who knows?
Besides, how are they going to get back the money spent on the acquisition?
Many times the answer to acquisitions has nothing to do with technology.
> Claude Code, FactoryAI, OpenCode, and others are all built with Bun.
Anthropic chose to use Bun to build their tooling.
We can imagine them making Bun an internal tool, pushing roadmap items that fit their internal products, whatever, but that doesn't answer how they get back the money from the acquisition.
Profit in those products has to justify now having their own compiler team for a JavaScript runtime.
Don't engage with this guy; he shows up in every one of these threads to pattern-match back to his heyday without considering any of the nuance of what is actually different this time.
Look, an admirer!
> I'll admit I'm somewhat biased against Bun?
Why? Genuine question, sorry if it was said/implied in your original message and I missed it.
Good question, hard to say, but I think it's mainly because of Zig. At its core Zig is marketed as a competitor to C, not C++/Rust/etc, which makes me think it's harder to write working code that won't leak or crash than in other languages. Zig embraces manual memory management as well.
> At its core Zig is marketed as a competitor to C, not C++/Rust/etc
What gives you this impression?
I directly created Zig to replace C++. I used C++ before I wrote Zig. I wrote Zig originally in C++. I recently ported Chromaprint from C++ to Zig, with nice performance results. I constantly talk about how batching is superior to RAII.
Everyone loves to parrot this "Zig is to C as Rust is to C++" nonsense. It's some kind of mind virus that spreads without any factual basis.
Rust is more of a competitor to C++ than C. Manual memory management is sometimes really helpful and necessary. Zig has a lot of safety features.
I mean, they said they looked at the source code and thought it was gross, so there’s a justification for their concern, at least.
I always figured Bun was the "enterprise software" choice, where you'd want to use Bun tools and libraries for everything and not need to bring in much from the broader NPM library ecosystem.
Deno seems like the better replacement for Node, but it'd still be at risk of NPM supply chain attacks which seems to be the greater concern for companies these days.
If you want to download open source libraries to be used in your Bun project then they will come from npm, at least by default. [1].
So it seems odd to say that Bun is less dependent on the npm library ecosystem.
[1] It’s possible to use jsr.io instead: https://jsr.io/docs/using-packages
Yes, both can pull in open source libraries, and I can't imagine either dropping that ability. They do, though, seem to differ in eagerness and competence on Node compatibility, and Bun seems better on that front.
From a long term design philosophy prospective, Bun seems to want to have a sufficiently large core and standard library where you won't need to pull in much from the outside. Code written for Node will run on Bun, but code using Bun specific features won't run on Node. It's the "embrace, extend, ..." approach.
Deno seems much more focused on tooling instead of expanding core JS, and seems to draw the line at integrations. The philosophy seems to be more along the lines of having the tools be better about security when pulling in libraries instead of replacing the need for libraries. Deno also has its own standard library, but it's just a library, and that library can run on Node.
That’s true of some parts of Deno’s standard libraries, but major functionality like Deno.test and Deno.serve are Deno-specific APIs.
Here are the Bun APIs:
https://bun.com/docs/runtime/bun-apis
Here are the Deno APIs:
https://docs.deno.com/api/deno/
Stopped following Deno while they were rejecting the need for a package management solution. Used Bun instead.
Isn’t that because packages were one of the problems Deno tried to fix?
They tried to realign package management with web standards and tools that browsers can share (URLs and importmaps and "cache, don't install"). They didn't offer compatibility with existing package managers (notably and notoriously npm) until late in that game and took multiple swings at URL-based package repositories (deno.land/x/ and JSR), with JSR eventually realizing it needed stronger npm compatibility.
Bun did prioritize npm compatibility earlier.
Today though there seems to be a lot of parity, and I think things like JSR and strong importmaps support start to weigh in Deno's favor.
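For the unfamiliar, Deno-style dependencies look roughly like this (a sketch; the versions are illustrative, and an import map in deno.json can alias these to bare names):

    // no install step: specifiers are fetched and cached on first run
    import chalk from "npm:chalk@5";        // npm compatibility specifier
    import { join } from "jsr:@std/path@1"; // JSR, successor to deno.land/x for std

    console.log(chalk.green(join("a", "b")));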
> is there anything that Bun does better?
Telling prospective employees that if you're not ready to work 60-hour weeks, then "what the fuck are you doing here?", for one.
> Zig does force some safety with ReleaseSafe IIRC
which Bun doesn't use, choosing to go with `ReleaseFast` instead.
Is it just me, or is npm not that slow? Sure, it's no speed demon, but I rarely need to run npm install anyway, so it's not a bottleneck for me.
For deploys, running the attached Terraform script usually takes more time.
So while a speed increase is welcome, I don't feel it gives me much of a boost.
The speed shows up for large projects. Especially if you end up with multiple node_modules directories in your dev sandbox.
I've been using Bun since 2022 just to be trendy for recruitment (it worked, and still works despite it almost being 2026)
Bun is fast, and it's worked as a drop-in replacement for npm in large legacy projects too.
I only ever encountered one issue, which was pretty dumb: Amazon's CDK has hardcoded references to various package managers' lock files, and Bun wasn't one of them:
https://github.com/aws/aws-cdk/issues/31753
This wasn't fixed until the end of 2024 and, as you can see, the fix was only accidentally merged but tolerated. It was promptly broken by a Bun breaking change:
https://github.com/aws/aws-cdk/issues/33464
But don't let Amazon's own incompetence be the confirmation bias you were looking for about using a different package manager in production.
You can use SST to deploy cloud resources on AWS or any cloud, and that package works with Bun.
My first thought went to how openai used Rust to build their CLI tool and Anthropic's CEO bought influence over Zig as a reaction.
That would require them to hire/buy the Zig team. Which is not the case.
> bought influence over Zig as a reaction
Elaborate? I believe Zig's donors don't get any influence and decision making power.
From the comments here it sounds like most people think the amount Anthropic paid for the company was probably not much more than the VC funding which Bun raised.
How would the payout split work? It wouldn’t seem fair to the investors if the founder profited X million while the investors only got their original money returned. I understand VC comes with the expectation that 99 out of 100 investments will net them no money. But what happens in the cases where money is made, but it just isn’t profitable for the VC firm?
What’s to stop everyone from doing this? Besides integrity, why shouldn’t every founder just cash out when the payout is life-changing?
Is there usually some clause in the agreements like “if you do not return X% profit, the founder forfeits his or her equity back to the shareholders”?
All VCs have preferred shares, meaning in case of a liquidation like now, they get their investment back, and then the remainder gets shared.
Additionally, depending on the round, they may also have multiples, like 2x, meaning they get at least 2x their investment before anyone else gets anything.
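A toy illustration with made-up numbers (a 1x non-participating preference, no multiple):

    // hypothetical: investors put in $7M for 20%, company sells for $30M
    const sale = 30_000_000;
    const preference = 7_000_000;    // 1x the invested amount
    const asConverted = sale * 0.20; // value of converting to common stock

    // non-participating preferred: investors take whichever is larger
    const investorPayout = Math.max(preference, asConverted); // $7M beats $6M
    console.log({ investorPayout, leftForCommon: sale - investorPayout });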
Probably not much more than their valuation, which is the key difference since the investor will still get a net return.
To be honest, I never thought of Bun as something that someone would buy or invest in. What product do they sell?
This acquisition makes no sense.
Investors must be happy because Bun never had to find out how to become profitable.
It’s enough that Anthropic finds it profitable to run Claude Code on it.
Hard to say it makes no sense when you don't know how much they were acquired for. I would guess it is a trivial amount relative to Anthropic's war chest.
> This acquisition makes no sense.
except this sense:
> Investors must be happy because Bun never had to find out how to become profitable.
But what is the upside for anthropic?
I've seen a few of these seemingly random acquisitions lately, and I congratulate the companies and individuals that are acquired during this gold rush, but it definitely feels awkwardly artificial.
Quote from the CEO of Anthropic in March 2025: "I think we'll be there in three to six months where AI is writing 90% of the code and then in 12 months we may be in a world where AI is writing essentially all of the code"
I think this wound up being close enough to true, it's just that it actually says less than what people assumed at the time.
It's basically the Jevons paradox for code. The price of lines of code (in human engineer-hours) has decreased a lot, so there is a bunch of code that is now economically justifiable which wouldn't have been written before. For example, I can prompt several ad-hoc benchmarking scripts in 1-2 minutes to troubleshoot an issue which might have taken 10-20 minutes each by myself, allowing me to investigate many performance angles. Not everything gets committed to source control.
Put another way, at least in my workflow and at my workplace, the volume of code has increased, and most of that increase comes from new code that would not have been written if not for AI, and a smaller portion is code that I would have written before AI but now let the AI write so I can focus on harder tasks. Of course, it's uneven penetration, AI helps more with tasks that are well-described in the training set (webapps, data science, Linux admin...) compared to e.g. issues arising from quirky internal architecture, Rust, etc.
That's ridiculous. No, it isn't even close.
At an individual level, I think it is for some people. Opus/Sonnet 4.5 can tackle pretty much any ticket I throw at it on a system I've worked on for nearly a decade. Struggles quite a bit with design, but I'm shit at that anyway.
It's much faster for me to just start with an agent, and I often don't have to write a line of code. YMMV.
Sonnet 3.7 wasn't quite at this level, but we are now. You still have to know what you're doing mind you and there's a lot of ceremony in tweaking workflows, much like it had been for editors. It's not much different than instructing juniors.
Why didn't they just use AI to write their own Bun instead of wasting 8-9 figures on this company? Makes no sense.
From the article, Claude Code is being used extensively to develop Bun already.
> Over the last several months, the GitHub username with the most merged PRs in Bun's repo is now a Claude Code bot. We have it set up in our internal Discord and we mostly use it to help fix bugs. It opens PRs with tests that fail in the earlier system-installed version of Bun before the fix and pass in the fixed debug build of Bun. It responds to review comments. It does the whole thing.
You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
> You do still need people to make all the decisions about how Bun is developed, and to use Claude Code.
Yeah but do you really need external hires to do that? Surely Anthropic has enough experienced JavaScript developers internally they could decide how their JS toolchain should work.
Actually, this is thinking too small. There's no reason that each developer shouldn't be able to customize their own developer tools however they want. No need for any one individual to control this, just have devs use AI to spin up their own npm-compatible package management tooling locally. A good day one onboarding task!
They likely have other things to do.
Deciding what to Implement
and
Implementing the Decisions
are complementary; one of these is being commoditised.
And, in fact, decimated.
Personally I am benefitting almost beyond measure because I can spend my time as the architect rather than the builder.
Same. I don’t understand how people aren’t getting this yet. I’m spending all day thinking, planning and engineering while spending very little time typing code. My productivity is through the roof. All the code in my commits is of equal quality to what I would produce myself, why wouldn’t it be? Sure one can just ask AI to do stuff and not review it and iterate, but why on earth would one do that? I’m starting to feel that anyone who’s not getting this positive experience simply isn’t good at development to begin with.
"Wasting" is doing a lot of work in that sentence.
They're effectively bringing on a team that's been focused on building a runtime for years. The models they could throw at the problem can't be tapped on the shoulder, and there's no guarantee they'd do a better job at building something like Bun.
Let me refer you back to the GP, where the CEO of Anthropic says AI will be writing most code in 12 months. I think the parent comment you replied to was being somewhat facetious.
Because 90% is not 100%.
Maybe he was correct in the extremely literal sense of AI producing more new lines of code than humans, because AI is no doubt very good at producing huge volumes of Stuff very quickly, but how much of that Stuff actually justifies its existence is another question entirely.
Why do people always stop this quote at the breath? The rest of it says that he still thinks they need tech employees.
> .... and in 12 months, we might be in a world where the ai is writing essentially all of the code. But the programmer still needs to specify what are the conditions of what you're doing. What is the overall design decision. How we collaborate with other code that has been written. How do we have some common sense with whether this is a secure design or an insecure design. So as long as there are these small pieces that a programmer has to do, then I think human productivity will actually be enhanced
(He then said it would continue improving, but this was not in the 12 month prediction.)
Source interview: https://www.youtube.com/live/esCSpbDPJik?si=kYt9oSD5bZxNE-Mn
I actually like Claude Code, but that was always a risky thing to say (actually I recall him saying their software is 90% AI-produced), considering their CLI tool is literally infested with bugs. (Or at least it was last time I used it heavily. Maybe they've improved it since.)
It’s writing 90% of my code now but it’s 100% reliant on me to do that effectively.
Do you have a source for the quote?
https://www.youtube.com/watch?v=9Shl1-ZJI6E
Is this why everyone only seems to know the first half of Dario's quote? The guy in that video is commenting on a 40 second clip from twitter, not the original interview.
I posted a link and transcription of the rest of his "three to six months" quote here: https://news.ycombinator.com/item?id=46126784
Thank you.
I'm curious what people think of quotes like these. Obviously it makes an explicit, falsifiable prediction. That prediction is false. There are so many reasons why someone could predict that it would be false. Is it just optimistic marketing speech, or do they really believe it themselves?
Everybody knows that marketing speech is optimistic. Which means if you give realistic estimates, then people are going to assume those are also optimistic.
Given the horrible stability of Windows this year, it seems like Microsoft went all in on that
Why didn't they have the AI write a JS runtime instead of this acquisition?
The big picture of “build a runtime” is an easier idea than “what would make this runtime better and how should the parts interact”.
Accurate for me. Accurate for basically every startup from the past 12 months. Prob not for legacy codebases, though.
AI writes about 90% of my code.
What languages and frameworks? What is the domain space you're operating in? I use Cursor to help with some tasks, but mainly only use the autocomplete. It's great; no complaints. I just don't ever see being able to turn over anywhere close to 90% with the stuff we work on.
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.
You can see my site here, if you'd like: https://chipscompo.com/
Only 10% to go for a full replacement.
Probably about 95% of mine now. Much better than I could for the most part.
Weird, AI writes terrible code for me that would never pass a code review. I guess people have different standards for good code.
Hah. It can’t be “I need to spend more time to figure out how to use these tools better.” It is always “I’m just smarter than other people and have a higher standard.”
Show us your repos.
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI.
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
https://github.com/sammcj/mcp-devtools
Spot on.
The tools produce mediocre code, usually working in the most technical sense of the word, and most developers are pretty shit at writing code that doesn't suck (myself included).
I think it's safe to say that people singularly focused on the business value of software are going to produce acceptable slop with AI.
Or maybe he's working in a space that is less out of distribution than the work you're doing?
You’re right, I’m not making a nextjs/shadcn/clerk/vercel ai wrapper startup.
I don't remember saying I worked with nextjs, shadcn, clerk (I don't even know what that one is), vercel or even JS/TS so I'm not sure how you can be right but I should know better than to feed the trolls.
I suspect you do not know how to use AI for writing code. No offence intended - it is a journey for everyone.
You have to be set up with the right agentic coding tool, agent rules, agent tools (MCP servers), dynamic context acquisition, and a workflow (working with the agent from a plan rather than simple prompting and hoping for the best).
But if you're lazy and don't put the effort in to understand what you're working with and how to approach it with an engineering mindset, you'll be left on the outside complaining and telling people how it's all hype.
Always the same answer: it's the user, not the AI being blown out of proportion. Tell me, where are all those great, amazing applications that were coded 95-100% by AI? Where is the great progress? Where are the great new algorithms, the great new innovations hiding?
My stack is React/Express/Drizzle/Postgres/Node/Tailwind. It's built on Hetzner/AWS, which I terraformed with AI. Probably 90-95% of it is AI driven.
It's a private repo, and I won't make it open source just to prove it was written with AI, but I'd be happy to share the prompts. You can also visit the site, if you'd like: https://chipscompo.com/
Well, there was this: https://martin.janiczek.cz/2025/11/21/fawk-llms-can-write-a-...
From the link:
"For now, I’ll go dogfood my shiny new vibe-coded black box of a programming language on the Advent of Code problem (and as many of the 2025 puzzles as I can), and see what rough edges I can find. I expect them to be equal parts “not implemented yet” and “unexpected interactions of new PL features with the old ones”.
If you’re willing to jump through some Python project dependency hoops, you can try to use FAWK too at your own risk, at Janiczek/fawk on GitHub."
That doesn't sound like some great success. It mostly compiles and doesn't explode. Also I wouldn't call a toy "innovation" or "revolution".
How many agents, tools, MCP & ACP servers, claude hooks, and workflows do I need to set up before English becomes a good programming language?
One agent, a few sub-agents, 1 MCP server, no "ACP" (never seen that used), no hooks, one workflow that I usually follow.
Do you know of any YouTube videos where you would say they do a very good job of showing off this style of coding?
I made this one recently: https://www.youtube.com/watch?v=qy4ci7AoF9Y - notes here: https://simonwillison.net/2025/Nov/6/upgrading-datasette-plu...
My best writing on this topic is still this though (which doesn't include a video): https://simonwillison.net/2025/Mar/11/using-llms-for-code/
Thanks for this! I've been looking for a good guide to an LLM based workflow, but the modern style of YouTube coding videos really grates on me. I think I might even like this :D
Always enjoy reading your blog Simon!
This one is a bit old now so a number of things have changed (I mostly use Claude Code now, Dynamic context (Skills) etc...) but here's a brief TLDR I did early this year https://www.youtube.com/watch?v=dDSLw-6vR4o
Post a repo
https://github.com/sammcj/mcp-devtools
How much time do you think you saved versus writing it yourself if you factored in the time you spent setting up your AI tooling, writing prompts, contexts etc?
Your best example of something you made with AI is another AI code generator… definitely not beating the AI bubble allegations anytime soon.
1. I didn't say it was a best example, I replied to a comment asking me to "Post a repo" - I posted a repo. 2. Straw man argument. I was asked for a repo, I posted a repo and clearly you didn't look at the code as it's not an "AI code generator".
1. I didn’t ask for a repo. 2. Still wasn’t me. Maybe an AI agent can help you check usernames? 3. Sorry, a plugin for an AI code generator, which is even worse of an example.
What is the business model behind open source projects like Bun? How can a company "acquire" it, and why does it do that?
In the article they write about the early days
Why do investors invest in people who build something that they give away for free? The post mentions why: Bun eventually wanted to provide some sort of cloud-hosting SaaS product.
Everyone could offer a cloud-hosted saas product that involves bun, right?
Why invest into a company that has the additional burden of developing bun, why not in a company that does only the hosting?
The standard argument here is that the maintainers of the core technology are likely to do a better job of hosting it because they have deeper understanding of how it all works.
There's also the trick Deno has been trying, where they can use their control of the core open source project to build features that uniquely benefit their cloud hosting: https://til.simonwillison.net/deno/deno-kv#user-content-the-...
Hosting is a commodity. Runtimes are too. In this case, the strategy is to make a better runtime, attract developers, and eventually give them a super easy way to run their project in the cloud. E.g. bun deploy, which is a reserved no-op command. I really like Bun's DX.
Yep. This strategy can work, and it has also backfired before, like with Docker trying to monetize something they gave away for free.
Except Amazon would beat them to it
Free now isn't free forever. If something has inherent value then folks will be willing to pay for it.
I mean if you're getting X number of users per day and you don't need to pay for bandwidth or anything, there's gotta be SOME way to monetize down the line.
Whether your userbase or the current CEO likes it or not.
Ads. Have you seen the dotenv JavaScript package?
Either for a modest return when it sells or as a tax write off when it fails.
VCs do not invest for a modest return.
No, but faced with either a loss or a modest return, they'll take the modest return (unless it's more beneficial not to, come tax season). Unicorns are called unicorns for a reason.
The question was why do investors invest
Extrapolating and wildly guessing, we could end up with using all that mostly idle CPU/RAM (the non-VRAM) on the beefy GPUs doing inference on agentic loops where the AI runs small JS scripts in a sandbox (which Bun is the best at, with its faster startup times and lower RAM use, not to mention its extensive native bindings that Node.js/V8 do not have) essentially allowing multiple turns to happen before yielding to the API caller. It would also go well with Anthropic's advanced tool use that they recently announced. This would be a big competitive advantage in the age of agents.
I almost read this as anthropic will be using our idle CPU/GPU resources for their own training tasks ;)
Anyone know how much Anthropic paid for Bun? I assume it was at least $26M, so Bun could break even and pay back its own investors, but I didn't see a number in the announcements from Anthropic or Bun.
I don't really see how Bun fits as an acquisition for an AI company. This seems more like "we have tons of capital and we want to buy something great" than "Bun is essential to our core business model".
If Anthropic wants to own code development in the future, owning the full platform (including the runtime) makes sense.
Programming languages all are a balance between performance/etc and making it easy for a human to interact with. This balance is going to shit as AI writes more code (and I assume Anthropic wants a future where humans might not even see the code, but rather an abstraction of it... after all, all code we look at is an abstraction on some level).
Even outside of code development, Anthropic seems to be leaning strongly into code interpreters over native tool calling for advancing agentic LLM abilities (e.g. their "skills" approach). Given that those necessitate a runtime of sorts, owning a runtime like Bun that lets them integrate that functionality into their products more seamlessly doesn't seem like the worst idea.
They will own it, and then what? Will Claude Code end every response with "by the way, did you know that you can switch to bun for 21.37x faster builds?"
They're baking the LORA as we speak, and it'll default to `bun install` too
Acquisition of Apple Swift division incoming?
TypeScript is the most popular programming language on the most popular software hosting platform though, owning the best runtime for that seems like it would fit Pareto's rule well enough:
https://github.blog/news-insights/octoverse/octoverse-a-new-...
I think there's a potential argument to be made that Anthropic isn't trying to make it easier to write TS code, but rather that their goal is a level higher, where the average person wouldn't even know what "language" is running it (in the same way most TS devs don't need to care about the many layers their TS code is compiled through).
According to a JetBrains dev survey (I forget the year) roughly 58% of devs deploy to the web. That's a big money pie right there.
Bun isn’t on the web. It’s a server runtime.
It's a JS runtime, not specifically servers though? They essentially can bundle Claude Code with this, instead of ever relying on someone installing NodeJS and then running npm install.
Claude will likely be bundled up nicely with Bun in the near future. I could see this being useful to let even a beginner use claude code.
Edit:
Lastly, what I meant originally is that most front-end work happens with tools like Node or Bun. At first I was thinking they could use it to speed up generating / pulling JS projects, but it seems more likely Claude Code and Bun will have a separate project where they integrate both and make Claude Code take full advantage of Bun itself, and Bun will focus on tight coupling to ensure Claude Code runs optimally.
They could do that already, nothing in the license prohibited them from doing so.
Sure, but Bun was funded by VCs and needed to figure out how to monetize, what Anthropic did is ensure it is maintained and now they have fresh talent to improve Claude Code.
"Server" here I used loosely; it obviously runs on any machine (e.g. if you wanted to deploy an application with it as a runtime). But it's not useful for web dev itself, which was my point.
Frontend work by definition doesn't happen with either Node or Bun. Some frontend tooling might use a JS runtime, but the value add of that is minimal, and a lot of JS tooling is actually being rewritten in Rust for performance anyway.
Why acquire Swift when you can write iOS apps in Typescript instead?
Which would use something like Bun ;)
It doesn't make sense, and you definitely didn't say why it'd make sense... but enough people are happy enough to see the Bun team reach an exit (especially one that doesn't kill Bun) that I think the narrative that it makes sense will win out.
I see it as two hairy things canceling out: the accelerating trend of the JS ecosystem being hostage to VCs and Rauch is nonsensical, but this time a nonsensical acquisition is closing the loop as neatly as possible.
(actually this reminds me of Harry giving Dobby a sock: on so many levels!)
Claude Code running on Bun is an obvious justification, but Bun's features (high-performance runtime, fast startup, native TS) are also important for training and inference. For instance, in inference you develop a logical model in code that maps to a reasoning sequence, then execute the code to validate and refine the model, then use this to inform further reasoning. Bun, which is highly integrated and highly focused on performance, is an ideal fit for this. Having Bun in house means you can use the feedback from all of that automation-driven execution of Bun to drive improvements to its core.
Looks like they are acquiring the team rather than the product
No, they're clearly acquiring the technology. They're betting Claude Code on Bun; they have a vested interest in the health of Bun.
Why would they want to bet on nascent technology when Node.js has existed for a good 15 years?
Because they needed something that could produce a single binary that works on every platform. They started shipping Claude Code with Bun back in July: https://x.com/jarredsumner/status/1943492457506697482
They could use the Node.js equivalent: https://nodejs.org/api/single-executable-applications.html#s...
Every time I see people mention things like this in node vs bun or deno conversations I wonder if they even tried them.
>The single executable application feature currently only supports running a single embedded script using the CommonJS module system.
>Users can include assets by adding a key-path dictionary to the configuration as the assets field. At build time, Node.js would read the assets from the specified paths and bundle them into the preparation blob. In the generated executable, users can retrieve the assets using the sea.getAsset() and sea.getAssetAsBlob() APIs.
Meanwhile, here's all I need to do to get an exe out of my project right now with, assets and all:
> bun build ./bin/start.ts --compile --outfile dist/myprogram.exe
> [32ms] bundle 60 modules
> [439ms] compile dist/myprogram.exe
It detects my dynamic imports of JSON assets (language files, default configuration) and bundles them accordingly into the executable. I don't need a separate file to declare assets, declare imports, or do anything other than run this command line. I don't need to survey the various bundlers to find one that works with my CLI tool and converts its ESM/TypeScript to CJS; Bun just knows what to do.
Node is death by a thousand cuts compared to the various experiences offered by Bun.
Node adds quite the startup latency over Bun too and is just not too pleasant for making CLI scripts.
I agree, they seem to have never tried it at all! Bun DX is the best, and Bun is the trend setter. Others are just catching up!
They evidently evaluated Node.js in comparison to Bun (and Deno) earlier this year and came to a technical decision about which one worked best for their product.
I highly doubt that the JS ecosystem is driven mostly by anything but hype, so I highly doubt the Node.js solution was even put on the table in an internal issue tracker.
Claude Code shipped on top of Node.js for the first four months of its existence.
Why wouldn't they consider their options for bundling that version into a single binary using Node.js tooling before adopting Bun?
It starts fast and does a better job than Node.js for their product.
Because Microsoft already owns that.
Are you referring to node? MS doesn't own that. It's maintained by Joyent, who in turn is owned by Samsung.
Joyent handed Node.js over to a foundation in 2015, and that foundation merged into the JS Foundation to become the OpenJS Foundation in 2019.
I'm not sure if Joyent have any significant role in Node.js maintenance any more.
Oops, thank you :)
regardless, it's certainly not MS.
Microsoft owns npm outright and controls every aspect of the infrastructure that node.js relies on. It also sits on the board (and is one of the few platinum members) of the Linux Foundation, which controls openjs. It is certainly MS.
That was my thinking: this would be useful for Claude Code.
It does actually.
Claude Code is a 1B+ cash machine and Anthropic directly uses Bun for it.
Acquiring Bun lowers the risk of the software being unmaintained as Bun made $0 and relied on VC money.
Makes sense, but this is just another day in San Francisco of a $0 revenue startup being bought out.
Does this acquisition mean Claude Code the CLI is more valuable than the entirety of Bun?
Claude Code has an annual run rate of $1bn. Bun currently has an annual run rate of $0.
It certainly generated more revenue, so this is not surprising?
> It certainly generated more revenue, so this is not surprising?
Anything is greater than 0
No, just that the people who lent Bun 7 million dollars want some of it back...
It boils down to: "We didn't have full conviction that over the long run we would prove superior to Node.js; however, an AI company burning a lot of cash has invested in us by basing their toolchain on us, so they had no option but to acqui-hire us."
quite the uncharitable take.
At the opposite end of the spectrum: Claude and Bun are just great technologies that joined forces.
>If most new code is going to be written, tested, and deployed by AI agents
That perspective following “in two-three years” makes me shudder, honestly.
Genuine question: why js?
Why not something like C#: native, fast, cross-platform, strongly typed, great tooling, supports both scripting (i.e. single-file based) and compiling to a binary with no dependency whatsoever (NativeAOT), great errors and error stacks; the list goes on.
All great for AI to recover during its iterations of generating something useful.
Genuinely perplexed.
If I was to pick a language, I'd pick the one all developers agree is the best.
Ahahahahhahahahahhahahahaahaha. Please tell me this is tongue-in-cheek and just more subtle than I give HN credit for. Please.
Not all devs, not even most, but I certainly think this
Sadly, this will be the trend moving forward. JS is perceived as a good language, and LLMs are meant to make it even easier to write. It is not about the merits of a language; it's about which languages LLMs are "good" at.
There’s like 100x more JS developers than C# developers. JS can also start running code very quickly, whereas with an AOT language you need to compile first. For tool calls, eval-as-a-service, and running in the browser, JS is far ahead of C#.
AIs are good at JS because there is basically a ton of JS code available publicly without usage restrictions: the JS code published to be executed in your browser. Most JS code attached to web pages has no explicit license, but the implicit license is that anyone can download it and run it. Same for HTML and CSS. So using that public code to train models is a no-brainer.
One other angle not yet mentioned: JS is browser-native. No matter how slow it is, the browser is now the lowest common denominator. A shared server-client codebase, while ugly, is another plus.
Same reason AIs also use Python and DBMSes offer JS or Py UDFs easily, interpreted languages take no build time and are more portable. JS is also very popular.
Might also be a context window thing. Idk how much boilerplate C# has, but others like Java spam it.
Because JS became an everything language that everyone can write, and it's the only language you ever need.
I dislike it too...
You could make a better argument for Go (compiles to native for multiple targets, zero actual dependencies, i.e. no need for a platform or virtual machine on the target).
C# has AOT compilation producing native, single file assemblies. A bit behind on this compared to Go, but it's there.
C# no longer requires .NET to be installed or bundled inside the exe.
Like I’ve said: NativeAOT
https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...
Go is the most portable compiled language out there and makes a lot of compromises with the interpreted lang world. But it's got its own issues.
>zero actual dependencies
Only on Linux with CGO_ENABLED=0, and good luck finding a non-web-related third-party module that can be used with CGO disabled.
Atwood’s Law
I use Claude Code CLI daily - it's genuinely changed how I work. The $1B number sounds crazy but honestly tracks with how good the tool is. Curious how Bun integration will show up in practice beyond the native installer.
Curious about the deal value/price — any clues whether it was just to make existing investors even (so say up to $30M) or are we talking some multiple? But if it's a multiple, even 2x sounds a bit crazy.
One option is that the current Bun shareholders didn't see a profitable future and didn't even care if they were made whole; a return of the remaining cash was adequate.
Another option is that this was an equity deal, where Bun shareholders believe there is still a large multiple's worth of potential upside in the current Anthropic valuation.
Plus many other scenarios.
I don’t get it either: Bun being the foundation of tons of AI tools is like the best possible outcome, so what were they hoping for when they raised the money? Or is this just an admission of "hey, that was silly, we need to land this however we can"? Or do they share major investors, and therefore this is just a consolidation? (Edit: indeed, KP did invest $100M in Anthropic this year. I’m also confused: the article states Bun raised 26M, but the KP seed round was 7; did they do the A too but unannounced? Notably, the seed was summer 2022 and ChatGPT was Nov 30, so the world is different. Did the hypothesis change?)
It's more honest than the Replicate answer, but I think that, inevitably, if you can't raise the next round and you get distracted by the shiny AI, this is the path taken by many teams. There is absolutely nothing wrong with that. There was an exuberant time when all the OSS things were getting funded, and now all the AI things get funded. For many engineer founders, it's a better fit to go build deep technical stuff inside a bigger company. If I had that chance I would probably have taken it too. Good luck to the Bun team!
Bun has completely changed my outlook on the JS ecosystem. Prior to Bun, there was little focus on performance. Now the entire space rallies around it.
Congrats to Jarred and the team!
> Prior to Bun, there was little focus on performance.
This is just completely insane. We went through more than a decade of performance competition in the JS VM space, and the _only_ justification that Google had for creating V8 was performance.
> The V8 engine was first introduced by Google in 2008, coinciding with the launch of the Google Chrome web browser. At the time, web applications were becoming increasingly complex, and there was a growing need for a faster, more efficient JavaScript engine. Google recognized this need and set out to create an engine that could significantly improve JavaScript performance.
I guess this is the time we live in. Vibe-coded projects get bought by vibe-coded companies and are congratulated in vibe-coded comments.
> Vibe-coded projects get bought by vibe-coded companies
this is so far from the truth. Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
> a decade of performance competition in the JS VM space
this was a rising tide that lifted all boats, including Node, but Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves.
> Bun, Zig, and uWebsockets are passion projects run by individuals with deep systems programming expertise. furthest thing from vibe coding imaginable.
Sure, I definitely will not throw projects like Zig into that bucket, and I don't actually think Bun is vibe-coded. At least that _used_ to be true, we'll see I guess...
Don't read a snarky comment so literally ;)
> Node is built with much more of the system implemented in JS, so it is architecturally incapable of the kind of performance Bun/uWebsockets achieves
That sounds like an implementation difference, not an architectural difference. If they wanted to, what would prevent Node or a third party from implementing parts of the stdlib in a faster language?
That's because it's not written in JS at all but in a compiled systems language; no wonder it's fast.
Virtually all JavaScript engines are written in compiled languages. (Most runtimes are, for that matter, not just JS.)
My mistake, I was thinking of the wider ecosystem, not the runtime: formatters, bundlers and linters like Biome, oxc, etc. being written in Rust or other compiled languages. That's where I saw the biggest speedup, because their developers decided to write them in a compiled language instead of JS on a JS runtime, where you'll inherently be limited even with a JIT.
One important original point of node was that v8 made JS very fast by compiling to machine code, plus it’s had multithreading built in for a decade.
Machine code, yes (along with SpiderMonkey, JSC and Nashorn); the 2005-2010 timeframe saw the introduction of JIT'ed JS runtimes. Back then, however, JS was firmly single-threaded; it was only with the introduction of SharedArrayBuffer that JS really started to receive multithreading features (outside of SharedArrayBuffer and other shareable/sendable types, a runtime could opt to run stuff like WebWorkers/WebAudioWorkers in separate processes).
Early Node, f.ex., had a multi-process setup built in; Node initially was about pushing the async-IO model together with a fast JS runtime.
Why Bun (and partially Deno) exists is because TypeScript helps so damn much once projects get a tad larger, but with Node, hot reloading was kinda slow: multiple seconds from saving a file until your application reloads. Even mainline Node nowadays has direct .ts file loading and type stripping to quicken the workflow.
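For illustration (flag and version details are from memory of Node's release notes, so treat them as approximate):

```sh
# Bun has always executed TypeScript directly
bun run app.ts

# Node gained built-in type stripping: behind a flag since 22.6,
# enabled by default since 23.6
node --experimental-strip-types app.ts
```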
> Prior to Bun, there was little focus on performance
v8 is one of the most advanced JIT runtimes in the world. A lot of people have spent a lot of time focusing on its performance.
That is the most absurd thing I've heard in 20 years. Chrome literally was launched on performance, for JS and beyond.
The reality is that the insane "JS ecosystem" will rally around whatever is the latest hotness.
All vendors will have to implement test-time code execution, solution exploration, etc., as it's low-hanging fruit with huge gains, so I see it as a great hire. Love Bun, happy for you guys!
But will they fix command line autocompletions?
So, what if Claude Code starts using Bun in all applicable situations? If model providers train their models to use a tech stack beneficial to their business interests?
Wondering to what degree this was done to support Anthropic’s web crawler. Would assume that having a whole JS runtime rather than just a HTTP client could be rather useful. Just hypothesising here, no clue what they use for their crawler.
There's no reason to run agents on expensive AI platforms or on GPUs when you can have the AI create an agent in JS that runs with very high performance and perfect repeatability on far less expensive CPUs.
At the very least there must be some part of the agent tasks that can be run in JS, such as REST APIs, fetching web results, parsing CSV into a table, etc.
Agents already do this exact thing, except that the go-to language for Claude to write one-off scripts in is usually Python.
Am I missing something? I thought that GPUs were for training the weights.
Being able to create an agent in any language to run on any hardware has always been possible hasn't it?
This somewhat answers the question of "how on earth is a JS runtime company going to profit?"
Has CC always used Bun? When I tried it out many months ago, the instructions said npm install, not bun install (although I did use bun install myself). It's just odd that if they were using Bun, the installation wasn't specifically a “bun install” (I suppose they were trying to keep it vanilla for the npm masses?)
they acquihired the team and derisked their investment in building claude code on top of bun. makes sense to me.
moreover, now they can make investments to make it an even more efficient and secure runtime for model workspaces.
This decision is honestly very confusing to me as a constant user of Claude Code (I have 3 of them open at the moment.)
So many of the issues with it seem to be because ... they wrote the damn thing in JavaScript?
Claude is pretty good at a constrained task with tests -- couldn't you just port it to a different language? With Claude?
And then just ... the huge claude.json which gets written on every message, like ... SQLite exists! Please, please use it! The scrollback! The keyboard handling! Just write a simple Rust or Go or whatever CLI app with an actual database and a reasonable TUI toolkit? Why double down and buy a whole JavaScript runtime?
Ink (and modern alternatives) are probably the best TUI toolkit. If you want to write a UI that's genuinely good, you need e.g. HTML, or some way to express divs and flexbox. There isn't really another way to build professional-grade UIs; I love immediate-mode UI for games, but the breadth of features handled by the browser UI ecosystem is astonishing. It is a genuinely hard problem.
And if you're expressing hierarchical UI, the best way to do it is HTML and CSS. It has the richest ecosystem, and it is one of the most mature technologies in existence. JS / TS are the native languages for those tools. Everything is informed by this.
Of course, there are other options. You could jam HTML and CSS into (as you mention) Rust, or C, or whatever. But then the ecosystem is extremely lacking, and you're reinventing the wheel. You could use something simpler, like QML or handrolled. But then you lose the aforementioned breadth of features and compatibilities with all the browser code ever written.
TypeScript is genuinely, for my money, the best option. The big problem is that the terminal backends aren't mature (as you said, scrollback, etc). But, given time and money, that'll get sorted out. It's much easier to fix the terminal stuff than to rewrite all of the browser.
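For a sense of what that looks like in practice, here is a minimal Ink sketch (my own illustration, not code from Claude Code):

```tsx
import React from "react";
import { render, Box, Text } from "ink";

// Flexbox layout and declarative styling, rendered to the terminal.
const Status = () => (
  <Box flexDirection="column" borderStyle="round" padding={1}>
    <Text bold color="green">Build succeeded</Text>
    <Text dimColor>3 tasks, 1.2s</Text>
  </Box>
);

render(<Status />);
```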
I like JS for this use case, and React on web, but really not fond of the Ink usage. Idk if it's Ink itself or the way it gets used, but somehow people are making CLIs that lag and waste terminal space now.
Ink seems to be the root cause of a major issue with the Claude Code CLI where it flickers horribly when it needs to repeatedly clear the screen and redraw.
I don't know why it's even necessary for this.
https://github.com/atxtechbro/test-ink-flickering
Issue on Claude Code GitHub:
https://github.com/anthropics/claude-code/issues/769
The idea that you need or want HTML or CSS to write a TUI is missing the entire point of what made TUIs great in the first place. They were great precisely because they were clean, fast, simple, focused -- and didn’t require an entire web stack to draw colored boxes.
I'm not so sure about that. I've written some nontrivial TUIs in my time, the largest one being [1], and as the project got more complicated I did find myself often thinking "It sure would be nice if I could somehow just write this stuff with CSS instead of tiny state machines and control codes for coloration". There's no reason these languages couldn't compile down to a TUI as lean as hand-coloring everything yourself.
[1]: https://taskusanakirja.com/
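To make the "control codes" point concrete, the hand-rolled approach looks something like this (an illustrative sketch, not code from the project above):

```ts
// Every style change is an escape sequence you emit and must reset yourself.
const green = (s: string) => `\x1b[32m${s}\x1b[0m`; // 32 = green fg, 0 = reset all
const bold = (s: string) => `\x1b[1m${s}\x1b[22m`;  // 1 = bold, 22 = normal weight

console.log(`${bold("tests:")} ${green("12 passed")}`);
// Nesting, wrapping, and terminal-width awareness are all on you;
// that bookkeeping is exactly what a CSS-like layer would absorb.
```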
I'm certainly not advocating for a return to C + ncurses, but there's a wide ocean of options between that and HTML+CSS+JS in the terminal.
Yes, for simple projects, absolutely. But when you're shipping something as widely adopted as CC, I disagree. At the end of the day, you're making a UI. It happens to be rendered via the terminal. You still need accessibility, consistent layouts, easy integration with your backend services, inputs, forms, and so on. If you don't need that stuff, there are lots of other, simpler options. But if you do, your other options begin to resemble a half-baked, bug-filled reimplementation of the web. So just use the web.
“Port it to a different language” a language that’s more out of distribution? Bad devex. Store data as an unreadable binary file? Bad devex.
Stay in distribution and in the wave as much as possible.
Good devex is all you need. Claude code team iterates and ships fast, and these decisions make total sense when you realize that dev velocity is the point.
I have to admit this was my first thought, too. I'm pretty obsessed with Claude Code, but the actual app is so incredibly poorly engineered for something that doesn't even do that much.
Rust, Go, whatever -- writing a good TUI isn't that hard of a problem. Buying an entire VC funded JS runtime company isn't how you solve it.
Boggles the mind.
So many comments about the reasoning here, yet none about the very obvious one: it's not stability of the infrastructure, it's the future direction of a product like Claude Code. They need to know how to keep their optimisation machine fitting developers' needs the best way possible (for good or for worse).
I guess we should wait for some opt-out telemetry some time soon. It'll be nothing too crazy at first, but we'll see how hungry they are for the data.
Don't they already have a ton of telemetry from Claude Code itself? I'd be shocked and expect an instant fork if Anthropic telemetry was added to Bun.
Oh no ... unfortunately this likely means a Bun.AI API in my JS runtime.
Congratulations to the team. Knowing some of the folks on the Bun team, I cannot say I am surprised. They are the top 0.001% of engineers, writing code out of love. I’m hugely bullish on Anthropic; this is a great first acquisition.
I don't get it. Why would Anthropic need to own a JS runtime?
Because they have a product that makes $1bn+ a year that depends on having a good, stable, cross-platform JS runtime.
I'm still confused. Why not just pour a ton of resources into it since it's open source. I guess dev mindshare? It is a great product
Pouring a ton of resources into an open source project that raised $26m in VC doesn't guarantee that the project will stick around. Acquiring it does.
Buying Bun to ensure it sticks around doesn't pass the smell test unless they had very few months of runway left
Bun had four years of runway left.
You're describing Node.js which has existed for the last 15 years
And is owned by Microsoft. The theory is that by symmetry Anthropic should own a node competitor.
Microsoft doesn't own node.
but they are a company that burns billions every year in losses and this seems like a pretty random acquisition.
Bun is the product that depends on providing that good, stable, cross-platform JS runtime and they were already doing a good job. Why would Anthropic's acquisition of them make them better at what they were already doing?
> Why would Anthropic's acquisition of them make them better at what they were already doing?
Because now the Bun team don't have to redirect their resources to implementing a sustainable business model.
>but they are a company that burns billions every year in losses
No they don't.
Ok, but Node is even more stable and mature - compare Node API parity in Bun, and also the issue count of Bun vs Node.
But they are not using node any more?
That doesn’t require or benefit from acquiring Bun. Node continues to exist and serve fine.
I'm wondering if Bun would be a good embedded runtime for Claude to think in. If it does sandboxing, or if they can add sandboxing, then they can standardize on a language and runtime for Claude Code and Claude Desktop and bake it into training like they do with other agentic things like tool calls. It'd be too risky to do unless they owned the runtime.
Why would Sun then Oracle own Java? Why would Microsoft own .net? Why would Apple own swift?
IOW look where the puck is going.
When I saw the headline I was ready to be mad, but after reading the post, I'm cautiously on board with this.
Shopify should buy Ruby on Rails because they depend on it.
didn't they try a hostile takeover of the ruby gems thing (forgot the name)?
Hope nobody buys Astral or Python is f*cked.
Then it would probably be back to Poetry. Or some other newcomer, or maybe a fork of uv.
uv is very forkable - dual-licensed under Apache and MIT, high quality codebase, it's Rust rather than Python but the Python community has an increasing amount of Rust experience these days.
That's why I'm not personally too nervous about the strategic risk to the Python community of having such a significant piece of the ecosystem come from a relatively young VC-backed company.
If you froze uv today, it'd take years for anything else to get to a state where the switch would be worth it.
Honestly, given the constant rollercoaster of version management and building tools for Python the move to something else would be expected rather than surprising.
uv seems like a great tool, but I remember thinking the same about pipenv, too.
uv is a revolution in every possible positive sense of the word in the Python world, and I've been here since 1.5. It is imperative that bitter oldtimers like us try it; I did, and the only regret I've got is that I didn't do it sooner.
I also tried it and am now using it for new projects. But I was just fine with Poetry too. Yes, uv is faster and probably better code. But my use-cases didn't involve re-creating the venvs frequently, so the slowness of Poetry didn't matter that much to me, and I am not using the "one-off script" kind of approach that uv enables (writing the dependencies in a comment in the script itself).
So, yeah, uv is nice, but for me didn't fundamentally change that much.
Our entire business runs on Python without a drop of Astral in the mix. No one would even notice.
you should try uv, really impressive tool
Honestly, that is an understatement. `uv run` has transformed how I use Python, since 99% of the time I don't need to set up or manage an environment and dependencies. I have tons of one-off Python scripts (with their dependencies in PEP 723 metadata at the top of the file) that just work with `uv run`.
I get how it might not be as useful in a production deployment where the system/container will be set up just for that Python service, but for less structured use-cases, `uv` is a silver bullet.
I don't want to even think about it. uv has been a revelation!
#1, uv is open-source and it could easily be forked and kept up to date.
#2, if you don't like uv, you can switch to something else.
uv probably has the least moat around it of anything. Truly a meritocracy: people use it because it's good, not because they're stuck with it.
Never used any of their tools.
Python is doing great, other than still taking baby steps toward having a JIT in CPython.
Finally, an event capable of killing the Python demon!
Congratulations to the bun team!
> I started porting esbuild's JSX & TypeScript transpiler from Go to Zig
How was Go involved there before Zig?
esbuild is still a Go app today: https://github.com/evanw/esbuild
The first hints of what became Bun appeared when Jarred experimented with porting that to Zig.
So this is a rug pull we were afraid of? Bun got me into javascript ecosystem after years of hating on it. This sucks.
This reads more like Anthropic wanted to hire Jarred, and Jarred wants to work with AI rather than build a SaaS product around Bun. I doubt it has anything to do with what is best for Bun the project. Considering Bun always seemed to value performance more than all else, the only real way for them to continue pursuing that value would be to move into actual JS engine design. This seems like a good pivot for Jarred personally, and likely a loss for Bun.
It doesn't read like that to me at all. This reads to me like Anthropic realizing that they have $1bn in annual revenue from Claude Code that's dependent on Bun, and acquiring Bun is a great and comparatively cheap way to remove any risk from that dependency.
I haven't had any issues moving projects between Node, Bun, and Deno for years. I don't agree that the risk of Bun failing as a company affects Anthropic at all. Bun has a permissive license that Anthropic could fork from, Anthropic likely knew that Oven had a long runway and wasn't in immediate danger, and switching to a new JS CLI tool is not the huge lift most people think it is in 2025. Why pay for something you are already getting for free, can expect to keep getting for free for at least four years, and could buy for less if it fails later?
This argument doesn’t make much sense to me. Claude Code, like any product, presumably has dozens of external dependencies. What’s so special about Bun specifically that motivated an acquisition?
A dependency that forms the foundation of your build process, distribution mechanisms, and management of other dependencies is a materially different risk than a dependency that, say, colorizes terminal output.
I’m doubtful that alone motivated the acquisition (it was surely a confluence of factors), but Bun is definitely a significant dependency for Claude Code.
MIT code: let Bun continue developing it, and once the project is abandoned, hire the developers.
If they don't want to maintain it: a GitHub fork with more motivated people.
> MIT code: let Bun continue developing it, and once the project is abandoned, hire the developers.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
If they found themselves pushing PRs to bun that got ignored and they wanted to speed up priority on things they needed, if the acq was cheap enough, this is the way to do it.
I'm also curious if Anthropic was worried about the funding situation for Bun. The easiest way to allay any concerns about longevity is to just acquire them outright.
Really? What risk is even there?
Except bun is OSS, so they could have just forked if something happened
It's not easy to "just" fork a huge project like Bun. You'll need to commit several devs to it, and they'll have to have Zig and JSC experience, a hard combo to hire for. In many ways, this is an acquihire.
Nah, it reads like the normal logic behind the consulting model for open source monetization, except that Bun was able to make it work with just one customer. Good for them, though it comes with some risks, especially when structured as an acquisition.
So Anthropic sees its CLI (in TypeScript) as the first-class product, and is maybe planning to expand Claude Code with more JS-based agents / ecosystem? Owning the runtime especially gives a lot of control over developer experience.
What matters: it's staying open source and MIT licensed. I sincerely hope it stays that way. Congrats to the Bun team on making a great tool and getting the recognition they deserve.
> Being part of Anthropic gives Bun: Long-term stability.
Let's see. I don't want to always be the downer but the AI industry is in a state of rapid flux with some very strong economic headwinds. I wouldn't confidently say that hitching your wagon to AI gives you long term stability. But as long as the rest of us keep the ability to fork an open source project I won't complain too much.
(for those who are disappointed: this is why you stick with Node. Deno and Bun are both VC funded projects, there's only one way that goes. The only question is timeline)
Nothing gives you long term stability in tech. You have to constantly work at staying stable, and it isn't always up to anything the company is in control of, no matter what ownership they have.
> Nothing gives you long term stability in tech.
Sure. But everything is relative. For instance, Node has much more likelihood of long term stability than Bun, given its ownership.
> Node has much more likelihood of long term stability than Bun
Given how many more dependencies you need to build/maintain a Node app, your Bun application has a better chance of long term stability.
With Node almost everything is third party (db driver, S3, router, etc) and the vast majority of NPM deps have dozens if not hundreds of deps.
I’m talking about long term stability of the tool and ecosystem, not of any specific app.
Sure, that makes it a good backup strategy. But there’s little reason to use a worse tool until the time you need the backup comes.
I'm confused. I installed claude code with the install command from the docs. I thought claude code just used Node.js? I didn't realise the recommended install used a different runtime.
They switched to recommending the install script back in July. That install script gives you a single binary which is created using Bun.
Maybe that's why I didn't have some of the bugs people were reporting on HN, or maybe it's because I was using Linux.
Genuine question, why acquisition when anthropic could simply sponsor, contribute and influence instead?
Acquisition seems like a large overhead and maybe a slight pivot to me.
Neat. I just started using bun as my default "batteries included" JavaScript engine, so it's nice they're getting this boost.
I'm only surprised that it wasn't Vercel who bought them.
Interesting that this announcement is tied in with one for Claude Code revenue.
Feels like maybe AI companies are starting to feel the questions on their capital spending? They wanna show that this is a responsible acquisition.
Considering that 1) Bun is written in Zig, 2) Zig has a strict no-AI policy [1], and 3) Bun has joined Claude, it seems that Bun and Zig are increasingly culturally apart.
[1] https://ziglang.org/code-of-conduct/#strict-no-llm-no-ai-pol...
You’re reading a code of conduct for contributing to the Zig project. I don’t think everything there is guidance for everything written in Zig; e.g. ‘English is encouraged’ is something one might not want for a project written in Zig by native French speakers, and I don’t think that’s something Zig would want to suggest to them. I read the AI part as much more motivated by the asymmetries of open-source project contribution than any statement about the language itself. Fly-by AI contributions are bad because they make particularly poor use of maintainer time. Similar to the rule on proposing language changes, which can suck up lots of reading/thinking/discussion time. When you have people regularly working together (e.g. those people at Anthropic working on Bun) the incentives are different, because there is a higher cost to wasting your colleague’s time.
> Bun and Zig are increasingly culturally apart
That's like saying GCC and NodeJS are culturally apart, as if that has significant bearing on either?
Nothing I found says anything about Zig folks being inherently against AI. It just looks like they don’t want to deal with “AI Slop” in contributions to their project, which is very understandable.
Godspeed. Seems like a good pairing. Bun is sort of the only part of the JS ecosystem I like, and Code has become such an important tool for my work, that I think good things will come out of this match. Go Bundler as well.
This morning I found myself muttering something I won't repeat as a reaction to Claude Code's remarkably slow startup time.
Put the Bun folks directly on that please and nothing else.
I’m curious what the acquisition price was. Bun said they’ve raised $26 million, so I’m assuming the price tag has to be a lot higher than that for investors to agree to an acquisition.
I'm sure the Bun team will get Claude Code straightened out. Weird acquisition, but TBH Anthropic needed to fill this hole.
Wouldn’t it make more sense to write the same functionality using a more performant, no-gc language? Aren’t competitors praised for their CLIs being faster for that reason?
With AI tooling, we are in the era where rapid iteration on product matters more than optimal runtime performance. Given that, implementing your AI tooling in a language that maximizes engineer productivity makes sense, and I believe GC does that.
JS/TS has a fundamental advantage, because there is more open source JS/TS than any other language, so LLMs training on JS/TS have more to work with. Combine that with having the largest developer community, which means you have more people using LLMs to write JS/TS than any other language, and people use it more because it works better, then the advantage compounds as you retrain on usage data.
One would expect that "AI tooling" is there for rapid iteration and one can use it with performant languages. We already had "rapid iteration" with GC languages.
If "AI tooling" makes developers more productive regardless of language, then it's still more productive to use a more productive language. If JS is more productive than C++, then "N% more productive JS" is still more productive than "N% more productive C++", for all positive N.
Codex is written in Rust
A single bun? Is that really newsworthy?
I use bun in a project but Claude Code always uses node to run throwaway scripts. Maybe they can persuade it to use bun as part of this acquisition?
I bet CC will become a binary with Bun included, and it'll use its internal JS engine to run most scripts.
Oddly I saw it try to use bun the other day, and was confused because everything in the project is in node.
I always tell it to use Bun and it works? Am I misunderstanding?
It seems the default is node (despite the project docs saying to use bun and all example script documentation using bun). It will use bun if told, but there’s definitely nothing saying to use node and it uses that anyway.
So, we can anticipate that the new Anthropic browser will now have the interpreter Ken Thompson previewed for us 41-odd years ago?
On the post they try to reassure readers about the following question: "If I bet my work project or company's tech stack on Bun, will it still be around in five or ten years?" The thing is, we don't know if Anthropic itself will be around in five to ten years.
:(
This wasn’t very high up on my list for acquisitions but props to the bun team for cashing in on the AI hype somehow!
My long-term bet on Node being "boring" and "stable" continues to pay major dividends. So glad I never invested any time and effort on this ecosystem…
That is the way: when one has been around a long time, one sees these alternatives come and go, while the reference platforms keep going.
I finally hope Bun will start to support and work on WSL1
Sounds like the goal is to bundle Bun with Claude Code insanely tightly, to the point where it doesn't matter if you have Node.js installed locally, while also letting them optimize key things in Claude Code's Bun runtime as needed. It's a brilliant acquisition, and Bun stays open source, which allows it to continue to grow, to Anthropic's benefit and everyone else's.
A nice start would probably be for Claude Code to stop trying to use npm when it detects a bun lockfile and vice versa...
I just ln bun to npm, npx, and node. This has the added benefit of letting ts_ls and various other tools work without requiring me to have both node and bun installed locally.
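Something like this, assuming ~/.local/bin comes first on your PATH (adjust paths for your setup):

```sh
ln -s "$(which bun)" ~/.local/bin/node
ln -s "$(which bun)" ~/.local/bin/npm
ln -s "$(which bun)" ~/.local/bin/npx
```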
Yeah Claude is very good, but it definitely needs to get "smarter" in some nuanced areas.
Makes sense; I had no idea how else the investors would have made money on a JavaScript bundler/JSC frontend.
Maybe they just like to work together *shrug*.
why couldn't Anthropic simply use Claude Code to write Bun over the weekend??
It is open source (MIT license), Claude should have a pretty good start on it already.
Aham, tx. Good to know - I'll switch my projects to Deno.
you know Deno is VC backed right
Congratulations to Jared. He and the team are Real Ziggers. Looking forward to a faster Claude Code!
Looks like a good time to try learning Zig again
Congrats Jarred and team! You have saved humanity many hours already, and I'm sure with Anthropic's backing, you will spare us many more. Farewell would-be headaches from Node & NPM tooling and waiting for builds and tests and package updates. Exciting times ahead!
Using bun on a side project reinvigorated my love of software development during a relatively dark time in my life, and part of me wonders if I would have taken the leap onto my current path if it weren't for the joy and feeling of speed that came from working with bun!
No strategic roadmap is ever going to tell you: "Build a $0-revenue JavaScript runtime and one day an AI company will acquire you"
It reminds me of hearing that music majors often do well in medical school. Want to go to medical school? Just major in music, duh.
Ha, Physics majors get the same talk about law school. It's just the selection bias of selecting for people willing to make hard pivots filtering out the under-achieving, go-with-the-flow types.
Lots of strategists will tell you something like: "Build something that's useful and then there will be money".
That's 100% what happened to Bun. It's useful (like really useful) and now they're getting rewarded
Honestly that's probably the best play. Monetizing dev tools directly is a nightmare.
And you risk ending up like Postman or Insomnia, once beautiful software which is now widely hated by developers.
Countdown till Astral is acquired?
i really think this was part of the pitch deck for bun's funding: that a bigger company would acquire it for the technology. the only reason an AI company, or any company for that matter, would acquire it would be to:
1. acquire talent.
2. control the future roadmap of bun.
i think it's really 1.
I had the same thought when openai acquired rockset.
Well, that was the playbook in the 1999-2001 dotcom days.
Which is probably why no one's going to recommend it these days
...but hey, things are different during a bubble.
okay so does that mean openai buys deno?
Who expects Anthropic to migrate all their code to Codeberg?
Wow.
Shouts out to the fellow who half-broke the news in this submission that was presumably killed because of the aggressive paywall: https://news.ycombinator.com/item?id=46123627
And apparently the submission's source for being the only org I can tell that anticipated this: https://www.theinformation.com/articles/anthropic-advanced-t...
Bun is such a great runtime. If you haven't tried it, try it. It's got bells and whistles.
This will make sure Bun is around for many, many, years to come. Thanks Anthropic.
Why Bun?
Easy to set up and go. (bun run <something.ts>)
Bells and whistles. (SQL, Router, SPA, JSX, Bundling, Binaries, Streams, Sockets, S3)
TypeScript Supported. (No need for tsc, bun can transpile for you)
Binary builds. (single executables for easy deployment)
Full Node.js Support. (The whole API)
Full NPM Support. (All the packages)
Native modules. (90% and getting better thanks to Zig's interop)
S3 File / SQL Builtin. (Blazingly Fast!)
You should try it. Yes, others do these things too, but we're talking about Bun.
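To give a taste of the list above, a minimal sketch using documented Bun APIs (S3 and Postgres omitted for brevity):

```ts
import { Database } from "bun:sqlite"; // SQLite ships with the runtime

const db = new Database(":memory:");
db.run("CREATE TABLE hits (path TEXT)");
const insert = db.query("INSERT INTO hits (path) VALUES (?)");

// HTTP server with zero third-party dependencies
Bun.serve({
  port: 3000,
  fetch(req) {
    const { pathname } = new URL(req.url);
    insert.run(pathname);
    return new Response(`hello from ${pathname}`);
  },
});

// Single-binary deployment: bun build --compile ./server.ts --outfile server
```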
It's not 100% Node.js compatible. I see enough non-green dots in their own official report: https://bun.com/docs/runtime/nodejs-compat
And even in packages with full support you can find many GitHub issues where Bun behaves differently, which leads to some bugs.
Not saying it’s 100% - there’s still the REPL missing - but all of Node’s API is available in the sense that it’s ABI compatible (or will be very near term).
> This will make sure Bun is around for many, many, years to come.
Well, until the bubble bursts and Anthropic fizzles out or gets acquired themselves.
If they keep it MIT licensed, then if/when things come crashing down, I think it's reasonable to expect Bun to continue on in some form, even if development slows without paid contributors.
...and then it's going to be time for an "incredible journey" post.
Does it have permission flags yet like deno has?
I’ve never understood the security utility of the Deno flags. What practical attack would they protect you from? Supply chain seems to be the idea, but how many npm packages do people use that don't either (the flag model itself is sketched below):
* get run by devs with filesystem permissions, or
* get bundled into production?
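For reference, the mechanism being debated is default-deny (the path and host values here are made-up examples):

```sh
# The script may read ./data and talk to one host; everything else is denied
deno run --allow-read=./data --allow-net=api.example.com main.ts
```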
It'll be around until they realize it makes $0 and costs them millions per year in salaries/stock. Then it will quietly die.
Anthropic uses a lot of bun. In fact, they bet the farm on it.
You think they wouldn't have done that napkin math before deciding to acquire it?
What a trip. Love both, so all good I guess.
Well, Bun is MIT-licensed. So once they change the license and/or kill the project, the community can fork it easily.
The point of this deal is that they do not need to change the license. Nobody will ever pay for Bun and now they don't have to force it.
Incredible news on so, so many levels!
(1) Bun is what technical startups should be. Consistently excellent decisions, hyper focused on user experience, and a truly excellent technical product.
(2) We live in a world where TUIs are causing billion dollar acquisitions. Think about that. Obviously, Bun itself is largely orthogonal to the TUIs. Just another use case. But also obviously, they wouldn't have been acquired like this without this use case.
(3) There's been questions of whether startups like Bun can exist. How will they make money? When will they have to sell out one of the three principles in (1) to do so? The answer seems to be that they don't; at least, not like we expected, and in my opinion not in a sinister way.
A sinister or corrupting sell out would be e.g. like Conan. What started as an excellent tool became a bloated, versioned mess as they were forced to implement features to support the corporate customers that sustained them.
This feels different. Of course, there will be some selling out. But largely the interests of Anthropic seem aligned with "build the best JS runtime", since Anthropic themselves must be laser focused on user experience with Claude Code. And just look at Opencode [^1] if you want to see what leaning all the way into Bun gets you. Single file binary distribution, absurdly fast, gorgeous. Their backend, OpenTUI [^2], is a large part of this, and was built in close correspondence with the Bun folks. It's not something that could exist without Bun, in my opinion.
(4) Anthropic could have certainly let Bun be a third party to which they contributed. They did not have to purchase them. But they did. There is a strange not-quite altruism in this; at worst, a casting off of the exploitation of open source we often see from the biggest companies. Things change; what seems almost altruistic now could be revealed to be sinister, or could morph into such. But for now, at least, it feels good and right.
[^1]: https://github.com/sst/opencode [^2]: https://github.com/sst/opentui
It makes total sense to me.
Can anyone provide some color around this: "I started porting esbuild's JSX & TypeScript transpiler from Go to Zig"? Hypothetical benefits include monolanguage for development, better interoperability with C and C++, no garbage collection, and better performance. What turned out to be realized and relevant here? Please, no speculation or language flames or wars.
Interesting. Looking through a strategic lens, I feel like this is related to the $1,000 free credit for Claude Code Web (I used a few hundred). What the heck are they aiming for? CodeAct? (https://arxiv.org/abs/2402.01030)
Hahaha congratulations. This is amazing. The most unlikely outcome for a devtools team. Fascinating stuff.
This is promising for Astral et al who I really like but worried about their sustainability. It does point to being as close to the user as possible mattering.
The Bun team works hard, glad to see it pay off.
Is Claude Code the first CLI tool to have a $1BN ARR?
I don't know for sure, but it's definitely the first tool of that value to have a persistent strobing (scroll position) bug so bad that passersby ask me if I'm okay when they see it.
Man, I had never even put words to that problem but you are right that it is beyond annoying. It seems to me like it worsens the longer the Claude instance has run - I don't seem to see it early in the session.
Yeah, issues have been open on GitHub for months. I've tried shortening my scrollback history and using other emulators but it doesn't seem to make a difference. It's pretty frustrating for a paid tool.
ha, I thought it was just a me thing and had accepted my fate.
This graph from the SemiAnalysis blog suggests that GitHub Copilot reached it earlier this year: https://substackcdn.com/image/fetch/$s_!BGEe!,f_auto,q_auto:...
"GitHub Copilot" encompasses so many different products now that it's hard to see it as a CLI tool.
It doesn't make a lot of sense that they'll compare Microsoft 365 Copilot with Claude Code, though? Like it is a legit CLI tool but we should ignore it because it shares the name with something else?
The GitHub Copilot CLI tool is brand new, they only launched that in September: https://github.blog/changelog/2025-09-25-github-copilot-cli-...
Prior to that GitHub Copilot was either the VS Code IDE integration or the various AI features that popped up around the GitHub.com site itself.
Terraform gets to $600mm if you squint really hard and make up stuff. Kubectl, though: whatever you want to say about Kubernetes complexity, it does get a bunch of money run through it. We could also look at aws-cli, gcloud and az, and if we assign the cloud budgets that get run through there, I'm sure it's in the hundreds of millions. Then there's git. Across the whole ecosystem, there's probably a cool couple billion floating through there. gh is probably much smaller. Other tools like docker and ansible come to mind, though those are not quite as popular. CC only hits $1B ARR if you squint really hard in the first place, so I think in this handwavy realm I'd say aws-cli comes first, then kubectl, then git, with maybe docker and terraform in the mix as well. Nonetheless, Claude is a really awesome CLI tool that I use most days.
Associated Anthropic post: https://www.anthropic.com/news/anthropic-acquires-bun-as-cla...
Love bun! Congratulations!
bullish for js, bearish for python?
Curious, how did he pay the bills when spending these years developing Bun?
Bun was VC funded.
I thought it said he was building a voxel game in the browser?
Good luck. I'm always worried about stuff like this because it has happened so many times, and the product eventually got worse. At the same time, I understand how much effort went into building something like Bun, and people need to fund their lives somehow, so there's that.
Video covering it: https://www.youtube.com/watch?v=6hEiUq8jWIg
In other news - Amp Code is a separate company now - https://ampcode.com/news/amp-inc
This announcement made me check in on the arbitrary code execution bug I reported that the Bun Claude bot created a PR for about 3 weeks ago:
https://github.com/oven-sh/bun/pull/24578
So far, someone from the bun team has left a bunch of comments like
> Poor quality code
...and all the tests still seem to be failing. I looked through the code that the bot had generated and to me (who to be fair is not familiar with the bun codebase) it looks like total dogshit.
But hey, maybe it'll get there eventually. I don't envy "taylordotfish" and the other bot-herders working at Oven though, and I hope they get a nice payout as part of this sale.
So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
The OP directly says:
> that the Bun Claude bot created a PR for about 3 weeks ago
The PR with bad code that's also been ignored was made by the bot that Bun made, and brags about in their acquisition post.
I just reported the bug, it was the bot that was proudly mentioned in the announcement which created the PR and the code...
> So you pushed a PR that breaks a bunch of tests, added a 5 layer nested if branch block that mixes concerns all over the place, then ignored the reviewer for three weeks, and you’re surprised they didn’t approve it?
...Did you miss the part where Bun used Claude to generate that PR?:)
I misinterpreted that first comment too. To clarify:
1. User krig reports an issue against the Bun repo: https://github.com/oven-sh/bun/issues/24548
2. Bun's own automated "bunbot" filed a PR with a potential fix: https://github.com/oven-sh/bun/pull/24578
3. taylordotfish (not an employee of Bun as far as I can tell, but quite an active contributor to their repo) left a code review pointing out many flaws: https://github.com/oven-sh/bun/pull/24578#pullrequestreview-...
Right, this is accurate. Except I thought taylordotfish worked for bun, so I guess no one at bun has looked at it at all then.
I did.
Reminds me of Atlassian buying an AI browser.
First major success story for Zig language? (Not trying to diminish Bun's team success)
I'd say Ghostty is a pretty big success story as well.
Let's not forget about TigerBeetle either. They weren't bought (as far as I'm aware), but they seem to have some pretty good backing from customers.
Congrats. This is the first time I remember reading a genuine, authentic story about a sale. Much preferred over “this is about continuing the mission until my earn-out is complete.”
> If Bun breaks, Claude Code breaks. Anthropic has direct incentive to keep Bun excellent.
and when this bubble pops down goes bun
Anthropic? The AI people?
Look, if a terminal emulator can raise $67 million by riding the AI hype wave, then a JavaScript runtime can do the same. Nobody ever said that AI investments and acquisitions have to make any sense.
wow!
Well this just created a lot of work for me. Everything’s turning to shit at an alarming rate.
Congrats...
> Long-term stability. a home and resources so people can safely bet their stack on Bun.
Isn't it the opposite? Now we've tied Bun to "AI" and if the AI bubble or hype or whatever bursts or dies down it'd impact Bun.
> We had over 4 years of runway to figure out monetization. We didn't have to join Anthropic.
There's honestly a higher chance of Bun sticking out that runway than the current AI hype still being around.
Nothing against Anthropic but with the circular financing, all the debt, OpenAI's spending and over-valuations "AI" is the riskier bet than Bun and hosting.
Yeah that’s the main part that puzzled me, super happy for the team that they got a successful exit, but I wouldn’t really consider Anthropic’s situation to be stable…
Yeah, no reader of tech news will take an acquisition of a company with four years of runway as anything but decreasing the odds their product will still be around (and useful to the same audience…) in four years. Even without being tied to a company with lots of exposure to a probable bubble.
How so? Presumably Jarred got a nice enough payout that if Anthropic failed, he would not need to work. At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
History?
I didn’t say it was definitely the end or definitely would end up worse, just that someone who’s followed tech news for a while is unlikely to take this as increasing the odds Bun survives mid-term. If the company was in trouble anyway, sure, maybe, but not if they still had fourish years in the bank.
“Acquired product thriving four years later” isn’t unheard of, but it’s not what you expect. The norm is the product’s dead or stagnant and dying by then.
> At that point, he's more than welcome to take the fully MIT licensed Bun and fork it to start another company or just continue to work on it himself if he so chooses.
Is there any historical precedent of someone doing that?
I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.
The effective demand for Opus 4.5 is bottomless; the models will only get better.
People will always want a code model as good as we have now, let alone better.
Bun securing default status in the best coding model is a win-win-win
Opus 4.5 is not living in a vacuum. It’s the most expensive of the models for coders, and there is Gemini 3 Pro, with many discounts, and DeepSeek 3.2, which is 50x cheaper and not much behind.
> I say don't muddy the water with the public panic over "will it won't it" bubble burst predictions.
It does matter. The public ultimately determines how much they get in funding if at all.
> The effective demand for Opus 4.5 is bottomless; the models will only get better.
The demand for the Internet is bottomless. Doesn't mean Dotcom didn't crash.
There are lots of scenarios in which this can play out, e.g. Anthropic fails to raise a certain round because the money dried up, or OpenAI buys Anthropic but decides they don't need Bun and closes out the project.
If claude code starts having ads for bun in the code it generates, I am never using it again.
To some degree, having “opinionated views on tech stacks” is unavoidable in LLMs, but this seems like it moves us towards a horrible future.
Imagine if claude (or gemini) let you as a business pay to “prefer” certain tech in generated code?
It's Google Ads all over again.
The thing is, if they own bun, and they want people to use bun, how can they justify not preferring it on the server side?
…and once one team does it… game on!
It just seems like a sucky future, that is now going to be unavoidable.
What?
Why?
Not to be confused with Bunn [1], the coffee maker makers.
[1] www.bunn.com
Hahahahahhaahhahahahahahhahahahahahhahahaha.
Regards.
Classic - brand new blog post:
> We’re hiring engineers.
Careers page:
> Sorry, no job openings at the moment.
It's the Anthropic careers page that you're likely looking for now:
https://www.anthropic.com/jobs?team=4050633008
Is it just me or does this page keep jumping back to the top when I try to scroll?
Same on iOS. It was probably vibe coded.
It's doing that for me as well (desktop Safari).
It's doing it to me as well in Brave on macOS.
Maybe the engineers are Claude agents.
deno won, rust won
Why the hell is a CLI coding agent built in JavaScript?
It’s wild what happens when a generation of programmers doesn’t know anything except webdev. How far from grace we have fallen.
The big advantage of a language like JavaScript or Python for a CLI tool of this nature is that they naturally support adding extensions or plugins.
That's quite a bit harder if your tool is built using a compiled language like Go.
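A sketch of why that's true: in JS/TS, loading user-supplied code at runtime is a one-liner via dynamic import(). The plugin shape below is hypothetical, not Claude Code's actual extension API:

```ts
// Hypothetical plugin interface, purely for illustration.
interface Plugin {
  name: string;
  activate(ctx: { log: (msg: string) => void }): void;
}

// A compiled Go binary would need a separate mechanism here
// (subprocess RPC, Wasm, etc.); a JS runtime just imports the file.
async function loadPlugin(path: string): Promise<void> {
  const mod = (await import(path)) as { default: Plugin };
  mod.default.activate({
    log: (msg) => console.log(`[${mod.default.name}] ${msg}`),
  });
}

await loadPlugin("./my-plugin.ts"); // hypothetical local path
```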
Well not gonna use Bun anymore I guess
Why not?
Because I avoid all major AI players with everything I got as all of them are thieves.
...you do know that YC has backed several AI companies, right?
Does that make it a big AI player? I only read shit on here.
Did you donate money or time to Bun?
Why would I
There you go.
Thank you for showing exactly why acquisitions like this will continue to happen.
If you don't support tools like Bun, don't be surprised to see them raise money from VCs and get bought out by large companies.
I make 2k a month; I don't have the financial freedom to support JavaScript runtimes.
oh well. it was cool while it lasted! I guess I'll figure out how to make deno do what I want, now.
anthropic won't win, and will just get bought out by an IBM or Oracle in the end... time to migrate from bun now