The recent resurgence of tools going native is really interesting. For years (at least the last decade) I've heard the mantra that JITs/interpreters are getting so good that you don't need to bother with native languages anymore, that clients can be written in Python/Ruby because it's a one-off operation anyway, etc.
And now everyone is rewriting everything in Go/Rust left and right.
It really comes down to the Go and Rust ecosystems being easy to work with. A decade or two ago, writing a native app meant setting up an automake/autoconf Rube Goldberg machine, which was especially difficult if you needed to build for multiple architectures.
I'd argue Rust and Go are even easier to work with than Python/JS/TS. The package management is better, and static linked native binaries eliminate so many deployment headaches.
It is complicated, though. Not as complicated as quantum physics, but still far more complicated than it needs to be, especially if you care about multiple architectures and platforms. At one point I was making builds for x86, AMD64, and Itanium on Windows, macOS, and Linux (which itself was subdivided into various distributions with different glibc/openssl/whatever versions). It took more work to maintain the build pipeline than to work on features.
Go and Rust prove you can get most of the benefit of C/C++ without paying that complexity cost.
Both can be true. It can be easy to learn and also a complete pain to set up and get right for every new project and menu of architectures you want to support.
Having recently done some investigation along the same lines for a side project that I did not continue to work on: is it not just a matter of passing CGO_LDFLAGS="-static" and building against musl?
To echo the sibling comments, but more generally: this basically only works on Linux. Other OSes generally don't let you fully statically link; you must keep some degree of dynamic linking (macOS, for instance, requires linking the system libraries dynamically).
Go used to try and let you do it, but has walked back those implementations after all the bugs they caused, as I understand it.
> And now everyone is rewriting everything in Go/Rust left and right.
Especially interesting for software that is 99.9% of the time waiting for inference to come back to you. Sure, it makes sense to rewrite something that relies heavily on the CPU, or where you want an easier time dealing with concurrency, but I don't feel that's what makes Codex take a long time to finish a prompt.
With that said, I also rewrote my local LLM agent software in Rust, as it's easier to deal with concurrency than in my initial Python prototype. That it compiles into a neat binary is an additional benefit, but I could just as easily have gone with Go instead of Rust.
> Especially interesting for software that are 99.9% of the time waiting for inference to come back to you.
In a different domain, I’ve seen a CLI tool that requests an OAuth token in Python be rewritten in Rust and get a huge performance boost. The Rust version had requested a token and presented it back to the app in a few milliseconds, but it took Python about five seconds just to load the modules the OAuth vendor recommends.
That’s a huge performance boost, never mind how much simpler it is to distribute a compiled binary.
I’ve spent some time optimizing Python performance in a web app and CLI, and yeah it absolutely sucks.
Module import cost is enormous, and while you can do lots of cute tricks to defer it from startup time in a long-running app because Python is highly dynamic, for one-time CLI operations that don’t run a daemon or something there’s just nothing you can do.
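A minimal sketch of the deferral trick mentioned above (the `LazyModule` helper is hypothetical, not from any library): a long-running app can move import cost off the startup path this way, but a one-shot CLI still pays the full cost on first real use, so it buys nothing there.

```python
import importlib

class LazyModule:
    """Defer a module's import until first attribute access."""

    def __init__(self, name):
        self._name = name
        self._module = None

    def __getattr__(self, attr):
        # Only called for attributes not found on the instance,
        # i.e. anything other than _name/_module.
        if self._module is None:
            self._module = importlib.import_module(self._name)
        return getattr(self._module, attr)

heavy = LazyModule("json")          # startup: nothing imported yet
assert heavy._module is None        # the import really is deferred
data = heavy.loads('{"ok": true}')  # first use triggers the real import
assert data == {"ok": True}
```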
I really enjoy Python as a language and an ecosystem, and feel it very much has its place…which is absolutely not anywhere that performance matters.
EDIT: and where there's a viable alternative. Python is still the ML language.
Python's startup cost is terrible. Same with Node. Go is very good, but Rust is excellent.
Even if a GC'ed language like Go is very fast at allocating/deallocating memory, Rust often has no need to allocate/deallocate that memory in the first place. The programmer gives the compiler the tools to optimize memory management, and machines are better than humans at optimizing memory. (Some kinds of optimizations, anyway.)
TBH I'm still surprised how quickly Go programs start up given how much stuff is there in init() functions even in the standard library (e.g. unicode tables, etc)
I know far too much about Python packaging while only knowing a little about it.
I agree it’s hell. But I’ve not found many comprehensive packaging solutions that aren’t gnarly in some way.
IMHO the Python Packaging community have done an excellent job of producing tools to make packaging easy for folks, especially if you’re using GitHub actions. Check out: https://github.com/pypa/cibuildwheel
Pypa have an extensive list of GitHub actions for various use cases.
I think most of us end up in the “pure hell” because we read the docs on how to build a package instead of using the tools the experts created to hide the chaos. A bit like building a deb by hand is a lot harder than using the tools which do it for you.
That’s fair. I’m also thinking about the sheer size of Python apps that make use of the GPU. I have to imagine a C++ app performing neural network shenanigans wouldn’t be >1GB before downloading weights.
Everything follows a cycle. Client/server, flat/skeuomorphic, compiled/interpreted, cloud/colo, colorful/minimalist, microservices/monolithic, standards-driven/vendor-driven, open/proprietary, CPU/xPU. There is a natural tension in the technical universe that pushes progress in one direction that loads the spring in the other direction. Over time you'll see the cycles rhyme.
"The existing code base makes certain assumptions -- specifically, it assumes that there is automatic garbage collection -- and that pretty much limited our choices."
Feels like the big shift is Rust hitting critical mass mindshare and LLM assisted translation making these rewrites viable. Rust is a very vibable language.
An announcement of Codex CLI being rewritten in C++ would be met with confusion and derision.
Once a Rust program finally compiles it's much likelier that it's correct, in comparison to other languages, at least in the sense of avoiding unexpected runtime behavior.
I suspect it's accidents of history that those are "native". Go's advantage is in its Google backing, and Rust is just a good language design that's managed to hit on the right kind of community. As far as I can see all the reasons that native was unnecessary are still valid.
> And then they ship an Electron based GUI on top.
If this catches on, and more tools get the "chatgpt, translate this into rust, make it go brrr" treatment, hopefully someone puts in the time & money to take Tauri that extra 10-20% left to make it a universal Electron replacement. Tauri is great, but still has some pain points here and there.
Speaking for myself: one reason would be because it's one of the things LLMs/tools like Codex are the most useful for.
When I have a smallish application, with tests, written in one language, letting an LLM convert those files into another language is the single task I'm most comfortable handing over almost entirely. Especially when I review the tests and all tests in the new language are passing.
Codex is terrible compared to Claude Code, even though Anthropic's individual models are not really that much better than OpenAI's. They should have made that their top priority instead of a rewrite, which will just push the improvements to the back seat.
This has been coming for a while now (there's been a Rust fork available to try forever); as I understand it, the impetus here is just getting the CLI installed in places where Node is annoying to install.
With all the RIIR going on in the Node.js ecosystem, I am waiting for the blog posts on how people completely got rid of Node.js, rewrote the whole backend in Rust, and how everything is so much better.
Waiting for Show HN: AbrasionNext, the framework evolution for frontend devs, with SaaS cloud deployment.
It seems Rust "just" needs a RoR-level web framework and a Qt-level GUI framework to take over the world; everything else is already conquered (with some Go enclaves still holding on).
This is one of the most ridiculous RIIR I've seen.
It's a CLI tool that makes API calls. I'd bet my bottom dollar that the performance difference between API-wrapping CLI tools in something like Ruby/Python vs Rust/C++ is negligible in perceived experience.
If they wanted people not to have a dependency on Node pre-installed, they could have shipped Single Executable Applications [0] or used a similar tool for producing binaries.
Having the CLI self-contained and easy to cross-compile is a huge improvement, as is the binary size (from what I've seen, wrapped Node binaries are huge). Also, a CLI tool should have low latency and no matter how good your engine is, an interpreter will have much higher startup latency than a proper binary.
I imagine a reasonable amount. The maintainer who is doing most of the Rust rewrite submitted a PR to one of the Ratatui widget libraries I maintain that seemed to be Codex produced[1].
This is an interesting proof of what I keep reading about: people are quick to make something, like a PR, with AI, but then the last 10% is left up in the air.
If codex was half as good as they say it is in the presentation video, surely they could’ve sent a request to the one in chatgpt from their phone while waiting for the bus, and it would’ve addressed your comments…
> Aider writes most of its own code, usually about 70-80% of the new code in each release. These statistics are based on the git commit history of the aider repo.
It's interesting how people conflate language choice with memory management strategy, compilation mechanism, distribution toolset, and type system strictness. I mean, on one hand, people explain that Rust's manual memory management makes it fast, but then also praise garbage collected Go for its speed and low startup latency. It's not that manual=fast and GC=slow: it's that Go's GC doesn't suck, and you can make fast GCed programs and slow manually memory managed ones (like games and their infamous loading screens). The equation of GC with slow is just a simplifying idea embedded in industry consciousness.
Likewise, you can make a single file distribution of a TypeScript program just fine. (Bun has built in support even.) But people don't think of it as a "thing" in that ecosystem. It's just not the culture. TypeScript means npm or Electron. That's the equivalence embedded in the industry hive mind.
To be clear, I'm not decrying this equivalence. Simplification is good. We use language as a shorthand for a bundle of choices not necessarily tied to language itself. You can compile Python with Nuitka or even interpret C. But why would you spend time choosing a point on every dimension of technical variation independently when you could just pick a known good "preset" called a language?
The most important and most overlooked part of this language choice bundle is developer culture. Sure, in principle, language choice should be orthogonal to mindset, areas of interest, and kinds of aptitude. But let's be real. It isn't. All communities of human beings evolve shared practices, rituals, shibboleths, and priesthoods. Developer communities are no exception.
When you choose, say, Rust, you're not just choosing a language. You're choosing that collection of beliefs and practices common among people who like to use that language. Rust people, for example, care very much about, say, performance and security. TypeScript people might care more about development speed. Your C people are going to care more about ABI stability than either.
Even controlling for talent level, you get different emphases. The Codex people are making a wire format for customizing the agent. C people would probably approach the problem by making a plugin system. TypeScript people would say to just talk to them and they'll ship what you need faster than you can write your extension.
Sometimes you even see multiple clusters of practitioners. Game development and HFT might use the same C++ syntax, but I'd argue they're a world apart and less similar to each other than, say, Java and C# developers are.
That's why language debates get so heated: they're not just expressing a preference. They're going to war for their tribe. Also, nothing pisses a language community off more than someone from a different community appropriating their sacred syntax and defiling it by using it wrong.
Codex isn't so much swapping out syntax as making a bet that Rust cultural practices outcompete TypeScript ones in this niche. I'm excited to see the outcome of this natural experiment.
I enjoy this ethnographic take on programming language choice being as much about culture as technical features. I wonder what effect LLM coding agents will have here as it makes big rewrites between languages trivial and potentially allows developers to quickly shift their programs between languages to gain advantages. Echoes some of the immigration debates the right stirs up.
I'm surprised that performance wasn't on the list. The main reason I started Brokk on the JVM was for access to https://github.com/joernio/joern, but high performance code that can easily go multicore is a very nice secondary reason.
It's not totally unjustified to call the GC mention overfocused, but that criticism itself feels like overfocusing to me.
Efficiency is the top-level goal, and that equates directly to performance in most computing tasks: more efficiency means being able to let other work happen. There are times when single-threaded outright speed is better, but usually in computing we try as hard as possible to get parallelism or concurrency into our approaches so that efficiency can translate directly into overall performance.
Overall performance as a bullet seems clear enough. Yes it's occluded by a mention of GC, but I don't think the team is stupid enough to think GC is the only performance factor win they might get here, even if they don't list other factors.
Even a pretty modest bit of generosity makes me think they're doing what was asked for here. Performance is very explicitly present and, to me, quite clearly an objective.
It sounds like what they're actually doing is going closed-source, and using a Rust rewrite as cover for that.
This is interesting, because the current Codex software supports third-party model providers. This makes sense for OpenAI Codex, because it is the underdog compared to Claude Code, but perhaps they have changed their stance on this.
[edit] Seems that this take is incorrect; the source is in the tree.
I would bet it took more wall-clock time to type out that comment than it would have for any number of AI agents to snap the required equivalent of `if not re.match(...): continue` into place
// TODO: Verify server name: require `^[a-zA-Z0-9_-]+$`?
There may be several elements of server name verification to perform.
That regex does not cover the complete range of possibilities for a server name.
The author apparently decided to punt on the correctness of this low-risk test -- pending additional thought, research, consultation, or simply higher prioritization.
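For concreteness, here is a sketch of what the TODO's proposed check could look like (the pattern comes from the quoted comment; the helper name is mine):

```python
import re

# Pattern proposed in the TODO: ASCII letters, digits, underscore, hyphen.
SERVER_NAME = re.compile(r"^[a-zA-Z0-9_-]+$")

def is_valid_server_name(name: str) -> bool:
    return bool(SERVER_NAME.match(name))

assert is_valid_server_name("my-server_1")
assert not is_valid_server_name("my.server")  # dots rejected
assert not is_valid_server_name("")           # empty string rejected
```

As noted above, this pattern is deliberately narrow: dotted or internationalized names fail it, which is exactly the kind of case the author punted on.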
OK, that's a good sign. I didn't see any links to the source in the GitHub thread, and the only mention of contributing in that thread is an email to apply for a job. I'm not going to delete my comment, because the subtree you showed doesn't have a LICENSE file so it isn't totally clear, but I agree it does appear to at least continue to be source-available.
Reviewing the source for this tree, it looks like it's been in public development for a fair amount of time, with many PRs.
It is in fact totally clear. There is a LICENSE file in the root of the repository. Adding a new subtree (directory) should not call into question whether or not that tree is covered by the preexisting license. That's silly. If they wanted to change the license then there needs to be an actual change to the license.
People will do anything to avoid reading and maintaining someone else's code. If that means rewriting in native, and someone (usually VCs) is paying for it, so be it. If you think working in an existing TS/JS codebase is hard, wait until someone hands you their 100k+ line Rust codebase and asks you to make changes. In five years, another big shift to rewrite and change everything.
Making changes to huge rust projects is quite easy. For a substantial alteration, you make your change, the compiler tells you the 100 problems it caused, and you fix them all (~50% auto fix, 30% Claude/Codex, 20% manual), then the program probably does the thing.
Architecting the original 100kloc program well requires skill, but that effort is heavily front loaded.
Probably minimal, it's mostly lower memory use from not having to boot a separate V8 engine with its own GC, like how Electron apps have to boot a separate browser engine. But CPU-wise it's not doing anything interesting on the client.
The neat thing for me is just not needing to set up a Node environment. You can copy the native binary and it should run as-is.
There are various solutions for turning node programs into standalone binaries, by more-or-less including Node within the binary. One I've used before with success is "pkg".
If you're parsing JSON or other serialization formats, I expect that can be nontrivial. Yes, it's dominated by the llm call, but serializing is pure CPU.
Also, ideally your lightweight client logic can run on a small device/server with bounded memory usage. If OpenAI spins up a server for each codex query, the size of that server matters (at scale/cost) so shaving off mb of overhead is worthwhile.
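A rough way to see the serialization point in isolation (payload shape and sizes here are made up for illustration):

```python
import json
import time

# A chat-style payload roughly the shape an agent client shuttles around.
payload = {"messages": [{"role": "user", "content": "x" * 1000}] * 100}

start = time.perf_counter()
for _ in range(50):
    decoded = json.loads(json.dumps(payload))
elapsed = time.perf_counter() - start

# The round-trip is pure CPU work: it scales with payload size and runs
# on every request, no matter how long the model call itself takes.
assert decoded == payload
```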
I get why OpenAI didn't invest time and money into this, but I do wonder if there's some reason that nobody has written a JavaScript frontend for LLVM.
There shouldn't be a reason why you couldn't and it would give you performance and zero dependency install.
> but I do wonder if there's some reason that nobody has written a JavaScript frontend for LLVM
Astral folks are taking notes. (I wouldn't be surprised if they already have a super secret branch where they rewrite Python and make it 100x faster, but without AI bullshit like Mojo).
What do you mean by “a JavaScript frontend for LLVM”?
Edit: ah, I see, I read “LLM” instead of LLVM at first! It's only after I posted my question that I realized my mistake.
I'm not sure it makes sense to compile JavaScript natively, due to the very dynamic nature of the language, you'd end up with a very slow implementation (the JIT compilers make assumptions to optimize the code and fall back to the slow baseline when the assumptions are broken, but you can't do that with AoT).
I think you've highlighted the real engineering challenge of using LLMs - overcoming the uncertainty. In some contexts, a reasonable level of uncertainty is fine, as the tool itself is being used by experts (i.e. the coding tools), but in other cases you need to engineer a lot more guardrails to ensure a result of sufficient quality.
Getting Node installed is a blocker for some of their customers, evidently. It doesn't surprise me. If it holds back one mid sized enterprise customer, the rewrite is worth it.
I have largely avoided the entire typescript / JavaScript ecosystem specifically because I don't want to deal with node or its ecosystem. It's just so confusing with yarn, npm, npx, then the build systems gulp, grunt, webpack, etc etc - felt very overwhelming.
Yes, if I spent more time learning these things, it would become simple but that seemed like a massive waste of time.
This needs admin permissions, which means a ticket with IT and a good chance it'll be rejected since it's scary as it'll open up the door to many admin level installs of software that IT has no control over.
Installing node under WSL is a better approach anyway, but that'll make it harder for enterprise customers still.
Node install can be a real pain sometimes. The Node ecosystem has had a number of security-related issues over the years; supply chain attacks are one of my main fears.
I think most package systems are going to start facing real supply chain attacks, if they aren't already. The Node ecosystem, from an attacker's lens, has quite a heavy ratio of non-security-conscious users, which makes it a better breeding ground for exploitation.
Sometimes people really need to follow Yoda's and Mr. Miyagi's advice, instead of jumping right into it.
https://nodejs.org/api/single-executable-applications.html
In my opinion, bundling the application payload would be sufficient for interpreted languages like Python and JavaScript.
This is the next trend.
Why would you say this for Rust in particular?
https://news.ycombinator.com/item?id=44149809
The comment you linked is talking about runtime errors in an unspecified application.
Seems like confirmation bias.
Note that most of these rewrites wouldn't be needed if the JIT language were Java, C#, Common Lisp, Dart, Scheme, or Racket.
All of those also have AOT compilers and JIT cache tooling.
Claude Code tends to write meaningless tests just to get them to pass—like checking if 1 + 1 = 2—and somehow considers that a job well done.
If it's not possible today, what are the challenges and where does a human need to step in and correct the model?
Trying to bring that with Slint: https://slint.rs
This is not happening. The new folks have moved to SPA/RSC and a RoR type framework doesn't make much sense in that context.
It's already here: https://loco.rs/
I'm writing a production-level app with it right now.
I did this for several projects, works great with much lower costs and compute/memory usage.
Or used Deno/Bun's native binary packaging.
[0] - https://nodejs.org/api/single-executable-applications.html
It's often parallel processing of I/O (network, filesystem) and computational tasks like testing and compiling code.
[1]: https://github.com/joshka/tui-markdown/pull/80
I suspect it's not much, because I never see any stats published by any of these companies.
Interesting model: every accepted prompt response gets a git commit without human modifications.
We are in the middle of a transition in programming paradigms.
Let the AI coding models flamewars start.
Unfortunately, that is a utopia that will never be realized.
People will keep learning programming languages based on hearsay, whatever books they find somewhere, influencers and what not.
> Optimized Performance — no runtime garbage collection, resulting in lower memory consumption
Introducing the list (efficiency resonates with me as a more specific form of performance):
> Our goal is to make the software pieces as efficient as possible and there were a few areas we wanted to improve:
The others ("zero dependencies") are not actually related to efficiency.
It's a way to close off codex. There's no point in making a closed source codex if it's in typescript. But there is if it's in rust.
This is just another move to make OpenAI less open.
The big one is not having node as a dependency. Performance, extensibility, safety, yeah, don't really warrant a rewrite.
That's a good point, maybe TypeScript would be a better candidate.
For what it would take to compile TS to native code, check out AssemblyScript.
Hilarious take.
Usage is in the eye of the user, I see.
https://nodejs.org/en/download
I never used nvm.
If someone doesn't get this, it is a skill issue.