I've built a load of utilities that do that just fine. I use vim as an editor.
The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though. See: https://learn.microsoft.com/en-gb/visualstudio/releases/2022... You should use these if you are not a single developer and have to collaborate with people - like in the old days, when we had pinned versions of the toolchain across the whole company.
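(For anyone who wants to try pinning a team to one of these: the installer exposes it through its documented --channelUri switch. A sketch, where the LTSC version segment is illustrative and, per the licensing discussion below, a Professional-or-higher license is needed:)

    vs_professional.exe --channelUri https://aka.ms/vs/17/release.LTSC.17.0/channel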
> The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though.
You only get access to the LTSC channel if you have a license for at least Visual Studio Professional (Community won't do it), so a lot of hobbyist programmers and students are not aware of it.
On the other hand, its existence is in my experience very well-known among people who use Visual Studio for work at some company.
That's not correct. You don't have to give your credit card details or even be logged in, but you are still required to hold some Visual Studio license. For hobbyists and startups the VS Community license is enough, but larger companies need a VS Professional license even for the VS Build Tools.
How strict Microsoft is with enforcement of this license is another story.
https://www.stacksocial.com/sales/microsoft-visual-studio-pr...
You do not need a Professional or Enterprise license to use the Visual Studio Build Tools:
> Previously, if the application you were developing was not OSS, installing VSBT was permitted only if you had a valid Visual Studio license (e.g., Visual Studio Community or higher).
From https://devblogs.microsoft.com/cppblog/updates-to-visual-stu... - for OSS, you do not even need a Community license anymore.
Well, let's say every company took that world view of open-source software. Then what happens? If people "tend to not give a crap" about licenses, all the nice guarantees of the GPL etc. disappear too.
There are zero guarantees, and commercial software uses GPL'd software as part of its products all the time. Licenses do not work, and you shouldn't respect them whenever you can get away with it.
Toolchains on Linux are not free from dependency hell either - ever install an npm package that needs cmake underneath? glibc dependencies that can't be resolved because you need two different versions simultaneously in the same build somehow... Python is in another realm here as well. That shiny C++ project that needs a bleeding-edge Boost version that is about 6 months away from being included in your package manager. Remember patching OpenSSL when Heartbleed came around (libssHELL)?
Visual Studio is a dog, but at least it's one dog - the real hell on Windows is .NET Framework. The sheer incongruity of which version of Windows has which version of .NET Framework installed, and which version of .NET your app will run in when launched... The actual solution at scale for universal Windows compatibility of your .NET app is to build a C++ shim that checks for .NET beforehand and executes the app with the correct version in the event of a multiple-version conflict - you can literally have 5 fully unique runtimes sharing the same .NET target.
When was the last time you actually used .NET? Because that's absolutely not how it is. The .NET runtime is shipped by default with Windows and updated via WU. Never mind that you're talking about .NET Framework, which has been outdated for years.
Only the latest .NET Framework 4.8 is shipped with Windows at this point.
.NET 10 supports a Windows 10 build from 10 years ago.
Yes, and in the wild, believe it or not, you'll find Windows 7 and Windows 8.
We had only just deprecated support for XP in 2020 - this was for a relatively large app publisher, ~10M daily active users on Windows. The installer was a C++ stub which checked the system's installed .NET versions and manually wrote the app.config before starting the .NET wrapper (or tried to run the portable .NET Framework installer if no version was found at all).
The app supported .NET 3.5* (2.0 base) and 4 originally, and the issue was that there was a ".NET Framework Client Profile" install on a surprising number of Windows PCs out there, and that version was incompatible with the app. If you just have a naked .NET exe, when you launch it (without an app.config in the current folder) the CLR will decide which version to run your app in - usually the "highest" version if several are detected... which in this case would start the app in the lightweight version and error out. Also, in the app.config file you can't tell it to avoid certain versions; you basically just say "use 4, then 2" and you're at the mercy of the CLR to decide which environment it starts you in.
This necessitated overrides in a static/native C++ stub that did some more intelligent verification first, before creating a tailored app.config and starting the .NET app.
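For readers who have never touched one, the mechanism being described is the startup section of app.config; a minimal sketch (the CLR probes the listed runtimes in order against what is installed):

    <configuration>
      <startup>
        <!-- "use 4, then 2": tried in order among installed runtimes -->
        <supportedRuntime version="v4.0" />
        <supportedRuntime version="v2.0.50727" />
      </startup>
    </configuration>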
Point? I'm an SRE on a .NET project; we have been through 6, 8, and 10, and it's cost us about 2-ish hours of work each time. As long as you don't get crazy, a .NET upgrade is just a matter of a new SDK and runtime and away you go.
Why is it OK that you have to invest 2x (number of apps) hours just because MS has such a short life cycle for its .NET versions?
.NET Framework should only be used for legacy applications.
Unfortunately there are still many around that depend on .NET Framework.
Since .NET 10 still doesn't support Type Libraries, quite a few new Windows projects must be written in .NET Framework.
Microsoft sadly doesn't prioritize this so this might still be the case for a couple of years.
One thing I credit MS for is that they make it very easy to use modern C# features in .NET Framework. You can easily write new Framework assemblies with a lot of C# 14 features. You can also add a few interfaces and get most of it working (although not optimized by the CLR, e.g. Span). For an example see this project: https://www.nuget.org/packages/PolySharp/
It's also easy to target multiple frameworks with the same code, so you can write libraries that work in both .NET programs and .NET Framework programs.
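A minimal sketch of such a multi-targeted library project (framework monikers and the PolySharp version are illustrative):

    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <!-- one compilation per listed framework -->
        <TargetFrameworks>net8.0;net48</TargetFrameworks>
        <LangVersion>latest</LangVersion>
      </PropertyGroup>
      <ItemGroup>
        <!-- source-generator polyfills so modern C# compiles on .NET Framework -->
        <PackageReference Include="PolySharp" Version="1.14.1" PrivateAssets="all" />
      </ItemGroup>
    </Project>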
Most likely it never will, because WinRT is the future, and WinRT has replaced type libraries with .NET metadata. At least from MS's point of view.
The current solution is to use the CLI tools just like C++.
However have you looked into ComWrappers introduced in .NET 8, with later improvements?
I still see VB6 and Delphi as the best development experience for COM; in .NET it was never that great - there are full books about doing COM in .NET.
Because that's pretty much any freaking thing - oh Python, oh PHP, oh driving a forklift, oh driving a car.
Once you invest time in using and learning it, it is a non-issue.
I do get pissed off when I want to use some Python lib but it just doesn't work out of the box, but there is nothing that works out of the box without investing some time.
Just like a car: get a teenager into a car and he will drive into the first tree.
Posting BS on Facebook shouldn't be the benchmark for how easy things should be.
Thus this should be less of a problem.
.NET does have flags to include the necessary dependencies with the executable these days, so you can just run the .exe and don't need to install .NET on the host machine. Granted, that does increase the size of the app (not to mention adding a shitton of DLLs if you don't build as a single executable), but this at least is a solved problem.
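For reference, the flags in question; a typical self-contained single-file publish looks like this:

    dotnet publish -c Release -r win-x64 --self-contained true -p:PublishSingleFile=true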
They do now, after .NET Core and several other iterations. You'll also be shipping a huge executable compared to a CLR-linked .NET app (which can be surprisingly small).
I went from Pop!_OS (Ubuntu) to EndeavourOS (Arch) Linux because some random software with an AppImage or whatever refused to run with Ubuntu's "latest" glibc and it ticked me off. I just want to run more modern tooling; haven't had any software I couldn't just run on Arch, going on over a year now.
Indeed. As recently as 2 hours ago I had to change the way I build a private Tauri 2.0 app (bundled as .AppImage) because it wouldn't work on the latest Kubuntu, but worked on Fedora and EndeavourOS. So now I have to build it on Ubuntu 22.04 via Docker. Fun fun.
Had fewer issues on EndeavourOS (Arch) compared to Fedora overall though... I will stay on Arch from now on.
>Toolchains on Linux are not free from dependency hell either - ever install an npm package that needs cmake underneath?
That seems more a property of npm dependency management than linux dependency management.
To play devil's advocate, the reason npm dependency management is so much worse than kernel/OS package management is that its scope is much bigger: 100x more packages, each package smaller, super deep dependency chains. OS package managers like apt/yum prioritize stability more and have a different process.
This is one of the things that tilts me about C and C++ that has nothing to do with memory safety: the compile/build UX is high friction. It's a mess for embedded (no GPOS) too, in comparison to Rust + probe-rs.
That hasn't been my experience at all. Cross-compiling anything in Rust was an unimaginable pain (3 years or so ago). While GCC's approach of having different binaries for different targets does have its issues, cross-compiling just works.
Ah sorry, I should clarify: I'm not referring specifically to cross-compiling, just general compiling. In Rust, whether PC or embedded, I run `cargo run`. For C or C++, it's who knows - an idiosyncratic set of steps for each project, error messages; it makes me frustrated. I keep a set of notes for each one I touch to supplement the project's own docs. I am maybe too dumb or inexperienced in some cases, but I am having a hard time understanding why someone would design that as the UX.
I want to focus on the project itself, not jump through hoops in the build process. It feels hostile.
For cross-compiling to ARM from a PC in Rust in particular, you run one CLI command to add the target. Then `cargo run`, and it compiles, flashes, and gives debug output (sketch below).
These are anecdotes; I am probably doing something wrong, but it is my experience so far.
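For the curious, the embedded flow being described looks roughly like this (the chip and target triple are illustrative, and probe-rs is assumed to be configured as the Cargo runner):

    # one-time setup
    rustup target add thumbv7em-none-eabihf
    # .cargo/config.toml:
    #   [target.thumbv7em-none-eabihf]
    #   runner = "probe-rs run --chip STM32F401RETx"
    cargo run --release   # builds, flashes via the debug probe, streams log output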
uv has more or less solved this (thank god). Night and day difference from Pip (or any of the other attempts to fix it, honestly).
At this point they should just deprecate Pip.
Very clearly a better option than continuing to use Pip. Even if they do change the license in a few years, I will definitely take several years of not being shat on by Pip over the comparatively minor inconvenience of having to switch to an open fork of uv when they rug-pull. If they ever do.
Continuing to use Pip because Astral might stop maintaining uv in future is stupidly masochistic.
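(For anyone who hasn't tried it, the day-to-day commands are roughly these; the package name is just an example:)

    uv venv                   # create .venv
    uv pip install requests   # drop-in pip-style install, dramatically faster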
That's where I stopped.
Toolchains on Linux distributions with adults running packaging are just fine.
Toolchains for $hotlanguage, where the project leaders insist on reinventing the packaging game, are not fine.
I once again state: these languages need to give up the NIH and pay someone mature and responsible to maintain packaging.
The counterpoint of this is Linux distros trying to resolve all global dependencies into a one-size-fits-nothing solution - with every package having several dozen patches trying to make a brand-new application release work with a decade-old release of libfoobar. They are trying to fit a square peg into a round hole and act surprised when it doesn't fit.
And when it inevitably leads to all kinds of weird issues the packagers of course can't be reached for support, so users end up harassing the upstream maintainer about their "shitty broken application" and demanding they fix it.
Sure, the various language toolchains suck, but so do those of Linux distros. There's a reason all-in-one packaging solutions like Docker, AppImage, Flatpak, and Snap have gotten so popular, you know?
> The counterpoint of this is Linux distros trying to resolve all global dependencies into a one-size-fits-nothing solution - with every package having several dozen patches trying to make a brand-new application release work with a decade-old release of libfoobar. They are trying to fit a square peg into a round hole and act surprised when it doesn't fit.
This is only the case for Debian and derivatives, lol. Rolling-release distributions do not have this problem. This is why most of the new distributions coming out are Arch-based.
Agreed, but I don't think that has to do with either its "vanillaness" or the 6-month release schedule. Fedora does a lot of compatibility work behind the scenes that distros not backed by a large company more than likely couldn't afford.
The real kicker is when old languages also fall into this trap. The latest I'm aware of is GHC, which decided to invent its own build system and install script. I don't begrudge them for moving away from Make, but they could have used something already established.
You can then build a script/documentation that isolates your specific requirements and workloads:
https://learn.microsoft.com/en-us/visualstudio/install/use-c...
Had to do this back in 2018, because I worked with a client with no direct internet access on its DEV/build machines (and even when there was connectivity, it was over traditional slow, high-latency satellite connections), so part of the process was also to build an offline install package.
Well - "run as admin" wasn't a problem for that scenario - as I was also configuring the various servers.
(And - it is better on a shared-machine to have everything installed "machine-wide" rather than "per-user", same as PowerShell modules - had another client recently who had a small "C:" drive provisioned on their primary geo-fenced VM used for their "cloud admin" team and every single user was gobbling too much space with a multitude of "user-profile" specific PowerShell modules...)
But - yes, even with a highly trimmed workload it resulted in an 80 GB+ offline installer... and as a server admin, I also had physical data-center access to load that installer package directly onto the VM host server via external drive.
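For reference, the documented installer switches make this scriptable; a sketch of an offline layout plus unattended install for just the C++ workload:

    rem create an offline layout containing only the C++ build tools workload
    vs_buildtools.exe --layout C:\vslayout --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended --lang en-US
    rem install unattended from that layout later
    C:\vslayout\vs_buildtools.exe --quiet --norestart --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended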
I am so fed up with this! Please, if you're writing an article using LLMs, stop writing like this!
"This isn't just [what the thing literally is]; it's [hyperbole on what the thing isn't]."
The purpose isn't information, the purpose is drama.
Er, sorry. I meant: the purpose isn't just drama—it's a declaration of values, a commitment to the cause of a higher purpose, the first strike in a civilizational war of independence standing strong against commercialism, corporatism, and conformity. What starts with a single sentence in an LLM-rewritten blog post ends with changing the world.
See? And I didn't even need an LLM to write that. My own brain can produce slop with an em dash just as well. :)
Is this post AI-written? The repeated lists with highlighted key points, the "it's not just [x], but [y]" and "no [a] just [b]" scream LLM to me. It would be good to know how much of this post and this project was human-built.
I was on the fence about such an identification. The first "list with highlighted key points" seemed quite awkward to me and definitely raised suspicion (the overall list doesn't have quite the coherence I'd expect from someone who makes the conscious choice; and the formatting exactly matches the stereotype).
But if this is LLM content then it does seem like the LLMs are still improving. (I suppose the AI flavour could be from Grammarly's new features or something.)
It's hated by everyone, why would people imitate it? You're inventing a rationale that either doesn't exist or would be stupider than the alternative. The obvious answer here is they just used an LLM.
> and clearly it serves some benefit to readers.
What?
We need a dictionary like this :D
Know what's more annoying than AI posts? Seeing accusations of AI slop for every. last. god. damned. thing.
So if you see LinkedInglish on LinkedIn, it may or may not be an LLM. Outside of LinkedIn... probably an LLM.
It is curious why LLMs love talking in LinkedInglish so much. I have no idea what the answer to that is, but they do.
"LinkedIn Standard English" is just the overly-enthusiastic marketing speak that all the wannabe CEOs/VCs used to spout. LLMs had to learn it somewhere.
Marketing people did have a tendency to be more bombastic and self-aggrandizing than average, speaking pompous shit devoid of meaning, but normal people most definitely do not speak like this:
> The build.bat above isn’t just a helper script; it’s a declaration of independence from the Visual Studio Installer
This is 100% GPT slop; you can even tell it's GPT specifically from the fact that it has a ";" instead of a "—", because the recent models were trained to use the em dash less and put a semicolon in the same places where they used to throw em dashes in the past.
GPT-4o would have done
>The build.bat above isn’t just a helper script—it’s a declaration of independence from the Visual Studio Installer
>you know why LLMs repeat those patterns so much
Unlike you, I do know why LLMs can fall into repeating certain patterns, and it most definitely has nothing to do with "how humans speak". The better the model (as a tool), the more it has been trained on artificially generated data that teaches it the "proper" way to do tasks. Instruction-tuned models have nothing to do with the original release of GPT-3; they have been their own thing ever since the release of ChatGPT itself.
You can control what sort of patterns it falls into, and this is why, if you had any ability to notice things as a human being, you would have seen how newer GPT-generated content has less em-dash spam even when the human generating the content doesn't bother touching up the text.
I came back around 2017*, expecting the same nice experience I had with VB3 to 6.
What a punch in the face it was...
I honestly cannot fathom anyone developing natively for Windows (or even OSX) in this day and age.
Anything will be a webapp or a Rust+egui multi-platform app developed on Linux, or nothing. The amount of self-hate required for Android/iOS is already enough.
* Not sure of the exact date. It was right in the middle of the WPF crap being forced as "the new default".
What if it was?
What if it wasn't?
What if you never find out definitively?
Do you wonder that about all content?
If so, doesn't that get exhausting?
What's exhausting is getting through a ten-paragraph article and realising there were only two paragraphs of actual content, then having to wade back through it to figure out which parts came from the prompt, and which parts were entirely made up by the automated sawdust injector.
That's not an AI problem, it's a general blog-post problem. Humans inject their own sawdust all the time. AI, however, can write concisely if you just tell it to. Perhaps you should call this stuff "slop" without the "AI" and then it doesn't matter who or what wrote it, because it's still slop regardless.
I completely agree with your parent that it's tedious seeing this "fake and gay" problem everywhere and wonder what an unwinnable struggle it must be for the people who feel they have to work out if everything they read was AI written or not.
> curl -L -o msvcup.zip https://github.com/marler8997/msvcup/releases/download/v2026...
No thanks. I'm not going to install executables downloaded from an unknown GitHub account named marler8997 without even a simple hash check.
As others have explained the Windows situation is not as bad as this blog post suggests, but even if it was this doesn’t look like a solution. It’s just one other installation script that has sketchy sources.
You don't have to install executables downloaded from an unknown GitHub account named marler8997. You can download that script and read it just like any other shell script.
Just like those complaining about curl|sh on Linux, you are confusing install instructions with source-code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous than downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you cannot actually download and read the script - something that actually can't be done with downloaded executables.
It is somewhat different when your system forces binaries to be signed... but yeah, largely agreed. The abject refusal of curl|sh is strange to me, unless the refusers are also die-hard GPL adherents. Binaries are significantly more opaque and easier to hide malware in, in almost all cases.
Wait till they find out what the Visual Studio Installer itself does :) I guess this person just trusts a big company like Microsoft who keeps their source hidden more than a single developer who publishes all their source?
If any of this is relevant to you, you're already running Windows, which means Microsoft already has root on your machine, which means it's futile to try to limit the extent to which you trust their binaries.
I know Jonathan Marler from some of his Zig talks and his work on the Win32 API bindings for Zig[0]; they are even linked from Microsoft's own repo[1] (not sure why he has 2 GitHub users/orgs, but you can see it's the same person in the commits).
[0] https://github.com/marlersoft/zigwin32 [1] https://github.com/microsoft/win32metadata
Alternatively, there's this:
Install Visual Studio Build Tools into a container to support a consistent build system | Microsoft Learn
https://learn.microsoft.com/en-us/visualstudio/install/build...
`winget install --id Microsoft.VisualStudio.2022.BuildTools`.
If you need the Windows(/App) SDK too for the WinRT features, you can add `winget install --id Microsoft.WindowsSDK.10.0.18362` and/or `winget install --id Microsoft.WindowsAppRuntime.1.8`.
Having been the person who used to support those packages: it's not that simple. You need to pass which workloads you need installed too, and if it's a project you're not familiar with, god help you.
I used to just install the desktop development one and then work through the build errors until I got it to work, which was somewhat painful. (Yes, .vsconfig makes this easier, but it still didn't catch everything when last I was into Windows dev.)
"winget install Microsoft.VisualStudio.BuildTools"
"winget install Microsoft.WindowsSDK.10.0.26100"
I thought for a moment I was missing something here. I always just use winget for this sort of thing as well. It may kick off a bunch of things, but it's pretty low effort and reliable.
Nearly all of the Windows hate I see comes from 20-year-old takes. (The Bing/Cortana/Copilot/ads slop criticism is warranted, but that is also easily disabled.)
I wish open-source projects would support MinGW, or at least not actively block its usage. It's a good compiler that provides excellent compatibility without the need for any extra runtime DLLs.
I don't understand how open source projects can insist on requiring a proprietary compiler.
There are some pretty useful abstractions and libraries that MinGW doesn't work with. The biggest example is the WIL[1], which Windows kernel programmers use and which is a massive improvement in ergonomics and safety when writing native Windows platform code.
[1] https://github.com/microsoft/wil
To answer your question: the headers.
'MinGW' is not GCC; it's an ABI, and from the developer perspective it is also the headers and the libraries. You can have GCC MinGW, Clang MinGW, Rust MinGW, Zig MinGW, C# AOT MinGW.
If you want to link MSVC-built libraries (that are external / you don't have source), MinGW may not be an option. For example, you can get the Steamworks SDK to build with MinGW, but it will crash at runtime.
From the capitalization I can tell you and the parent might not be aware it's "Minimalist GNU for Windows", which I would tend to pronounce "min-g-w" and capitalize as "MinGW". I used to say "ming". Now it's my little friend. Say hello to my little friend, mang.
Just use Clang + MSVC STL + WinSDK. Very simple.
Newer C# features like ref returns, structs, spans, et al. make the overhead undetectable in many cases.
https://github.com/prasannavl/WinApi
https://github.com/microsoft/CsWin32
For big C++ projects, the .vsconfig import/export way of handling Visual Studio components has worked well for the large teams I'm on. Tell someone to import a .vsconfig and the Visual Studio Installer does everything. The only times we've had issues is from forgetting to update it with component/SDK changes.
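For those who haven't seen one, a .vsconfig is plain JSON listing component IDs; a minimal sketch (the two IDs shown are the MSVC x64/x86 compiler component and a Windows SDK component; exact IDs vary by VS version):

    {
      "version": "1.0",
      "components": [
        "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
        "Microsoft.VisualStudio.Component.Windows11SDK.22621"
      ]
    }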
Yeah, seems like this is just ignorance around .vsconfig files. Makes life way easier. You can also just use the VS Build Tools exe to install things instead of the full VS installer, if you plan to use a different IDE.
> It’s so vast that Microsoft distributes it with a sophisticated GUI installer where you navigate a maze of checkboxes, hunting for which “Workloads” or “Individual Components” contain the actual compiler. Select the wrong one and you might lose hours installing something you don’t need.
I have a vague memory of stumbling upon this hell when installing the ldc compiler for dlang [1].
[1] https://wiki.dlang.org/Building_and_hacking_LDC_on_Windows_u...
Actually it's not that complicated: you simply check in a global.json [0] where you specify the SDK and workload versions.
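A minimal global.json sketch (the SDK version is illustrative):

    {
      "sdk": {
        "version": "8.0.100",
        "rollForward": "latestPatch"
      }
    }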
Then you also specify target platform SDK versions in the .csproj file, and VS will automatically prompt the developer to install the correct toolchain.
[0] https://learn.microsoft.com/en-us/dotnet/core/tools/global-j...
What you're actually wanting here is .vsconfig: https://learn.microsoft.com/en-us/visualstudio/install/impor...
At $workplace, we have a script that extracts a toolchain from a GitHub Actions Windows runner, packages it up, and stuffs it into Git LFS, from which it is then pulled by Bazel as a C++ toolchain.
This is the more scalable way, and I assume it could still somewhat easily be integrated into a Bazel build.
Keeping CI entirely out of Windows desktop development is the biggest efficiency and cost improvement I've seen in the last 15 years. Our CI toolchain broke, so we moved back to a release manager doing it manually. It takes him 20x less time to build and distribute it (scripted) than it does to maintain the CI pipeline, and his desktop machine is several times faster than any cloud CI node we can get hold of.
Edit: It also uses a shitload less actual energy than full-building a product thousands of times when the result never gets run.
I wonder if Microsoft intentionally doesn't provide this first-party, to force everyone to install VS, especially the Professional/Enterprise versions. One could imagine that we'd have a vsproject.toml file, similar to pyproject.toml, that just does everything when combined with a minimal command-line tool. But that doesn't exist, for some reason.
Microsoft doesn't seem to care unless you're a company. That's the reason Community edition is free. Individual licenses would be pennies to them, and they gain more than that by having a new person making things in their ecosystem. It's in their interest to make their platform as accessible as possible.
For Visual Studio components that are versioned (like the C++ compilers/libraries), the version number is included in the component name.
You still have to install the tool that processes pyproject.toml so that doesn’t seem fair to hold against it. You are right that you still have to know whether to install 2022 or 2026.
You've never experienced genuine pain in your life. Have you tried to change the GCC compiler version in Linux?
If it's not packaged and you've got to build it yourself, Godspeed. And if you've got to change libc versions…
One day I decided to port my text editor to Windows. Since it depends on pcre2 and treesitter, these two libraries had to be provided by the system.
In the span of ~2 hrs I didn't manage to find a way to get the Zig compiler to notice the "system" libraries to link against.
Perhaps I'm too spoiled by installing a system wide dependency in a single command. Or Windows took a wrong turn a couple of decades ago and is very hostile to both developers and regular users.
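For context, the build.zig calls in question look roughly like this (a sketch; the Zig build API shifts between versions, and this is not the author's actual script):

    // in build.zig, after creating the executable step:
    exe.linkLibC();
    exe.linkSystemLibrary("pcre2-8");     // resolved via pkg-config on Unix-like systems
    exe.linkSystemLibrary("tree-sitter"); // no equivalent system-wide lookup on Windows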
I think providing purely functional libraries as system dependencies, tied to the toolchain of the time, was the wrong decision by the Unix world.
The system libraries should only ship system stuff: interaction with the OS (I/O, graphics basics, process management), accessing network (DNS, IP and TLS). They should have stable APIs and ABIs.
Windows isn't hostile. It has a different paradigm, and Unix (or, more correctly, usually GNU/Linux) people do not want to give up their worldview.
PCRE is basically only your app's dependency. It has nothing to do with the rest of the operating system. So it is your responsibility to know how to build and package it.
All dependencies should be vendored into your project.
If you depend on a library and can't figure out how you would compile against it, it's probably better for the end user that you don't make anything because you'll still need to package it up later unless you link statically.
I suspect the pitfall is how you or the Zig compiler is linking. Unless you're involving things which vary by OS, like hardware interaction, networking, file systems, etc., you should not, with a new language in 2026, need to do anything special for cross-platform capabilities.
My understanding is that the "linkSystemLibrary" abstraction in build.zig only holds for Unix systems. And this in turn makes it impossible to build my program on Windows without modifying the build script.
It starts by not looking into Windows through UNIX developer glasses.
The only issue currently plaguing Windows development is the mess with WinUI and WinAppSDK since Project Reunion, however they are relatively easy to ignore.
>It starts by not looking into Windows through UNIX developer glasses.
People don't need any UNIX biases to just want multiple versions of MSVS to work the way Microsoft advertises. For example, with every new version of Visual Studio, Microsoft always says you can install it side-by-side with an older version.
But every time, the new version of VS has a bug in the install somewhere that changes something that breaks old projects. It doesn't break for everybody or for all projects but it's always a recurring bug report with new versions. VS2019 broke something in existing VS2017 installs. VS2022 broke something in VS2019. etc.
The "side-by-side-installs-is-supposed-to-work-but-sometimes-doesn't" tradition continues with the latest VS2026 breaking something in VS2022. E.g. https://github.com/dotnet/sdk/issues/51796
I once installed VS2019 side-by-side with VS2017 and when I used VS2017 to re-open a VS2017 WinForms project, it had red squiggly lines in the editor when viewing cs files and the build failed. I now just install different versions of MSVS in totally separate virtual machines to avoid problems.
I predict that a future VS2030 will have install bugs that break VS2026. The underlying issue that causes side-by-side bugs to re-appear is that MSVS installs are integrated very deeply into Windows: they put files in c:\windows\system32, etc. (And sometimes you also get random breakage with mismatched MSVCRT???.DLL files.) To avoid future bugs, Microsoft would have to re-architect how MSVS works -- or "containerize" it to isolate it more.
In contrast, gcc/clang can have more isolation without each version interfering with each other.
I'm not arguing this thread's msvcup.exe tool is necessary but I understand the motivations to make MSVS less fragile and more predictable.
Note that this also doesn't work on Linux - your system's package manager probably has no idea how to install and handle having multiple versions of packages and headers.
That's why docker build environments are a thing - even on Windows.
Build scripts are complex, and even though I'm pretty sure VS offers pretty good support for having multiple SDK versions at the same time (that I've used), it only takes a single script that wasn't written with versioning in mind, to break the whole build.
> Note that this also doesn't work on Linux - your system's package manager probably has no idea how to install and handle having multiple versions of packages and headers.
But this isn't true. Many distros package major versions of GCC/LLVM as separate packages, so you can install and use more than one version in parallel, no Docker etc. required.
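For example, on Debian/Ubuntu (version-suffixed packages coexist; exact versions depend on the release):

    sudo apt install gcc-12 gcc-13
    CC=gcc-12 make        # pick the compiler per build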
It can indeed be true for some things, such as the C library, but often not for the compilers.
The closest thing I saw to this was some vendors shipping their SDKs with half the desktop userland (in a similar 'blob' fashion the post complains about), with shell scripts setting up paths so that their libs and tools are found before system ones.
Why? You may end up with something that doesn't get much attention anymore, but none of the official GUI approaches have ever been removed as far as I know. Win32, MFC, WinForms, WPF, WinUI, MAUI are all still available, and apps using them are functional. Even WinJS still works apparently, even if it was handed over.
I wouldn't start an app in most of them today, but I wouldn't rewrite one either without a good reason.
Well, a number of them have horrific bugs which get zero attention. At least Win32 has an abstraction level which allows you to work around them.
There's a fun bug with WPF and form backgrounds, for example, which means that on fractional-DPI screens the background is tiled unpredictably. Had to patch that one up rather quickly one day, and it was a mess due to how damn complicated WPF is.
I'll just keep using Mārtiņš Možeiko's script, portable-msvc.py, that this tool is based upon. It does everything this does, except a lock file and the autoenv. I'm not particularly interested in the former, and definitely not the latter.
I was just setting up a new machine and was setting up the Rust environment. The very first thing rustup-init asked was to install Visual Studio before proceeding. It was like 20-30 GB of stuff installed before moving forward.
This tool would have been a great help if I had known about it beforehand.
> On Linux, the toolchain is usually just a package manager command away. On the other hand, “Visual Studio” is thousands of components.
That package manager command, at the very least, pulls in 50+ packages of headers, compilers, and their dependencies from tens of independent projects, nearly each of them following its own release schedule. Linux distributions have it much harder orchestrating all of this, and yet it's Microsoft that cannot get its wholly-owned thing together.
No one should use any of these weird Frankenstein monstrosities in 2026. And a batch script? :( PowerShell exists.
Install:
- contrary to the blog post, the entirety of Visual Studio, because the IDE and debugger are *really damn good*.
- LLVM-MinGW[1]
Load the 'VSDevShell' DLL[2] for PowerShell, and you're good to go, with three different toolchains now:
cl.exe from VS
clang-cl.exe - you don't need to install this separately in VS; just use the above-mentioned llvm-mingw clang.exe as `clang.exe --driver-mode=cl /winsysroot <path\to\Windows SDK> /vctoolsdir <path\to\VC>`. Or you can use it in GNU-driver-style mode and use -Xmicrosoft-windows-sys-root. This causes it to target the Windows ABI and link against the VS SDK/VC tools.
`clang.exe` that targets the Itanium ABI and links against the MinGW libraries and LLVM libc++.
Done and dusted. Load these into a CMake toolchain and never look at them again.
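(As a sketch of that last step, a clang-cl toolchain file can be this small; every path below is a placeholder:)

    # clang-cl-toolchain.cmake
    set(CMAKE_C_COMPILER   "C:/llvm-mingw/bin/clang-cl.exe")
    set(CMAKE_CXX_COMPILER "C:/llvm-mingw/bin/clang-cl.exe")
    set(CMAKE_C_FLAGS_INIT   "/winsysroot C:/winsysroot")
    set(CMAKE_CXX_FLAGS_INIT "/winsysroot C:/winsysroot")
    # usage: cmake -B build -DCMAKE_TOOLCHAIN_FILE=clang-cl-toolchain.cmake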
People really like overcomplicating their lives.
At the same time, learn the drawbacks of all toolchains and use what is appropriate for your needs. If you want to write Windows drivers, then forget about anything non-MSVC (unless you really want to do things the hard way for the hell of it). link.exe is slow as molasses but can do incremental linking natively. cl.exe's code gen is (sometimes) slightly worse than Clang's. The MinGW ABI does not understand things like SAL annotations[3], and this breaks very useful libraries like WIL[4] (or libraries built on top of them, like the Azure C++ SDK[5]). The MinGW headers sometimes straight-up miss newer features that the Windows SDK comes with, like cfapi.h[6].
LLVM-MinGW sounds external to Microsoft, though. I think the blog focused on in-Microsoft solutions. And I am not sure the "contrary to the blog post" is valid - compared to Linux, the Microsoft stack is much more annoying to install. I installed it, but it was annoying to no end and took ages.
> compared to Linux, the Microsoft stack is much more annoying to install.
Not really. It's just different. As a cross-platform dev, all desktop OSes have their own idiosyncrasies that add up to a net of "they are all equally rather bad".
I dunno, it has its uses when porting software written for UNIX-first. Plus, I pointed out Clang, rather than GCC, because Clang is natively a cross-compiler. I don't like to be dogmatic about stuff; if it's useful then it's useful. If it isn't then I will say why (as I explained why there's no need for MSYS2/Cygwin below).
No. You cannot even do direct compilation (same host and target) with Clang alone.
LLVM doesn't come with the C library headers (VCRuntime) or the executable runtime startup code (VCStartup), both of which are under proprietary Visual Studio licenses. So to use Clang on Windows without MinGW, you need Visual Studio.
I use MinGW without any extra libs (no MSYS); it just uses the ancient msvcrt.dll that is present in all Windows versions, so my programs work even on Windows 2000.
Additionally the cross-compiler on Linux also produces binaries with no extra runtime requirements.
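(For example, with the stock mingw-w64 cross toolchain on a Linux host:)

    x86_64-w64-mingw32-gcc -O2 -o hello.exe hello.c   # links against msvcrt.dll by default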
But that's the point - I don't want the same style of executable as Visual Studio produces. Having to distribute a bunch of DLLs and having worse compatibility is pretty bad.
A major part of the incompatibility with older versions of Windows is just that newer VS runtimes cut the support artificially. That's it. Many programs would otherwise work as-is or with just a little help.
Yeah, you can get away with this nowadays because Git itself installs two-thirds of the things you need anyway. You just need to finish the job by getting the package and putting the binaries in your Git folder. Bam! mingw64, clang, whatever cc you need. It all links to standard Windows stuff, because you have to tell the linker where your win32.lib is. But this is true no matter the compiler; it's just that Visual Studio supplies this in some god-awful Program Files path.
MSYS2 is repacked Cygwin, though. It is literally the same codebase compiled with slightly different flags. You need a full Unix environment for Bash to run, not just the MinGW toolchain. The difference is that Cygwin aims to create a full Unix system, while MSYS2 provides just enough of a development environment to run bash, make, etc. to build native Windows programs with MinGW.
Git installs its own MinGW and MSYS2 stuff, but mostly compiled for a MinGW environment, so they consume Windows paths natively instead of using MSYS2/Cygwin path conversion. That's why all hell breaks loose with Git when you have a mixed PATH variable.
It was not clear what the parent commenter was addressing; I was under the impression they meant "compile against the MSYS2 environment", which is broadly Cygwin, yes, and which should not be forced onto a user.
Okay, but that just seems to be perpetuating the misunderstanding of what MSYS2 is intended for.
It gives you a *nix-like shell/dev environment and tools, but you build native software that runs on Windows systems that don’t have or need to have all/parts of MSYS2/Cygwin installed.
I built a network daemon using the MSYS2 CLANG64 environment and llvm toolchain on Windows 10.
Windows 7 x64 users could download the compiled single-file executable and run it just fine, so long as they installed Microsoft’s Universal C Runtime, which is a free download from Microsoft’s website.
I get your point, although my point is that there is actually zero need for MSYS at all for this, even as a developer, and especially not with the 'CLANG64' environment - those binaries are themselves built to run in the MSYS2 environment. This is how I cross-compile from Windows... to Windows with LLVM-MinGW[1]:
It's been 14 years since I've used MSVC for anything real. IIRC the philosophy back then was yearly versioned releases with rolling intermediate updates.
This seems to go down the road towards attempts at deterministic-ish builds, which I think is probably a bad idea, since the whole ecosystem is built on rolling updates, and a partial move towards pinning dependencies (using bespoke tools) could get complicated.
Another option is to explore winget and Chocolatey. Most build tools and compilers can be installed via the command line on Windows. Ask your favorite LLM to create a PowerShell script to install them all.
To me it seems as if Microsoft wants to make it deliberately harder to have software developers. Now - I installed all the required things and compiled on Windows too, but it is very annoying compared to Linux. Microsoft should simply have ONE default build, e.g. "download this and 80% of developers will be happy". No need for a gazillion checkboxes.
Fair question! Nope. I'm not endorsing it, and certainly don't know (or even suspect) it would solve this issue. I just recently installed NixOS and was surprised to see Windows mentioned on the downloads page, so looked into it a bit. Maybe soon.
Windows Native is fine. People in that space are comfortable with it.
What needs to be fixed is the valley between Unix and Windows development for cross-OS/many-compiler builds, so that someone who does both can work seamlessly.
It's not an easy problem and there are lots of faux solutions that seem to fix it all but don't (in builds, the devil is in edge cases).
I seriously doubt that people who get confused by the MSVC++ Installer will be able to handle a CLI app that installs a mystery MSVC++ toolchain version to a versioned directory. They're still going to click the Visual Studio icon on their desktop and scratch their head why your script didn't magically fix their problems.
Say what you want about coding agents, when the cost of writing code goes to near-zero, the cost of wrangling tools becomes a much bigger fraction of development effort. This is an amazing opportunity to address long-standing frictions.
I'm just asking, but is there really a need for native programs anymore? Where I worked a decade ago, we started porting all our native programs over to the browser, and this was when the MVC beta was just being released. At this point, with Electron and Tauri, is there even a need to write a native program?
Now with AI, I would think that porting a native program to the browser wouldn't be the chore it once was.
Yes, very definitely. There has always been a need for high-performance native applications. Even at the beginning of the desktop computing revolution these questions were being asked... and yes, there is a balance between native and cloud/browser-based computing - some of it is personal, much of it is industrial and corporate, and the part of the spectrum where both methods are applicable still exists, even decades later.
> is there really a need for native programs anymore
As long as you don't give a shit about the fact that your baseline memory consumption is now 500MB instead of 25MB, and that 80% of your CPU time is wasted on running javascript through a JIT and rendering HTML instead of doing logic, no.
If you don't give a shit about your users or their time, there's indeed no longer a need to write native programs.
I use COM and DLLs to extend software/automate. Using Visual Studio gives me some really nice debugging options.
I did try using Python and JS, but the variable explorer is garbage due to 'late binding'.
I thought this was just my ignorance, but I've asked experts and AI, and searched Google, and they unfortunately agree. That said, some people have created their own log/prints so they don't need to deal with it.
At the risk of being that guy, I haven't had any issues onboarding people onto native projects written in Rust. rustup does a great job of fetching the required toolchains without issue. I'd imagine the same is also true of Go or Zig.
While Microsoft <3 Rust, there is still some tooling-quality parity to reach versus Visual Studio's abilities for .NET, Python, and C++:
Incremental compilation and linking, parallel builds, hot code reloading, a REPL, graphical debugging of optimised builds, GPU debugging...
Go is better left for devops stuff like Docker and Kubernetes, and it remains to be seen when Zig becomes industry-relevant beyond HN and Reddit forums.
I'm pretty sure people who write and build C++ on Windows do it for good reasons, often reasons that are out of their control. Your comment is not going to make any difference.
You have to do this for certain Rust things too. I can't remember which, but I inevitably run into a need to install the MSVC toolchain to compile Rust. I think it might be related to FFI, or libs which use FFI? The same thing comes up on Linux, but the process to install it is different.
I got anxiety reading the article, describing exactly why it sucks. It's nice to know from the article and comments here there are ways around it, but the way I have been doing it was the "hope I check the right checkboxes and wait a few hours" plan. There is usually one "super checkbox" that will do the right things.
I have to do this once per OS [re]install generally.
The Visual Studio toolchain does have LTSC and stable releases - no one seems to know about them though. see: https://learn.microsoft.com/en-gb/visualstudio/releases/2022... - you should use these if you are not a single developer and have to collaborate with people. Back like in the old days when we had pinned versions of the toolchain across whole company.
[1] https://download.visualstudio.microsoft.com/download/pr/5d23...
You only get access to the LTSC channel if you have a license for at least Visual Studio Professional (Community won't do it); so a lot of hobbyist programmers and students are not aware of it.
On the other hand, its existence is in my experience very well-known among people who use Visual Studio for work at some company.
How strict Microsoft is with enforcement of this license is another story.
https://www.stacksocial.com/sales/microsoft-visual-studio-pr...
> Previously, if the application you were developing was not OSS, installing VSBT was permitted only if you had a valid Visual Studio license (e.g., Visual Studio Community or higher).
From (https://devblogs.microsoft.com/cppblog/updates-to-visual-stu...). For OSS, you do not even need a Community License anymore.
Visual studio is a dog but at least it's one dog - the real hell on windows is .net framework. The sheer incongruency of what version of windows has which version of .net framework installed and which version of .net your app will run in when launched... the actual solution at scale for universal windows compatibility on your .net app is to build a c++ shim that checks for .net beforehand and executes it with the correct version in the event of multiple version conflict - you can literally have 5 fully unique runtimes sharing the same .net target.
Only the latest .NET Framework 4.8 is shipped with Windows at this point.
.NET 10 supports a Windows 10 build from 10 years ago.
We had just deprecated support for XP in 2020 - this was for a relatively large app publisher ~10M daily active users on windows. The installer was a c++ stub which checked the system's installed .NET versions and manually wrote the app.config before starting the .net wrapper (or tried to install portable .NET framework installer if it wasn't found at all).
The app supported .NET 3.5* (2.0 base) and 4 originally, and the issue was there was a ".NET Framework Client Profile" install on as surprising amount of windows PCs out there, and that version was incompatible with the app. If you just have a naked .NET exe, when you launch it (without an app.config in the current folder) the CLR will decide which version to run your app in - usually the "highest" version if several are detected... which in this case would start the app in the lightweight version and error out. Also, in the app.config file you can't tell it to avoid certain versions you basically just say "use 4 then 2" and you're up to the mercy of the CLR to decide which environment it starts you in.
This obviated overrides in a static/native c++ stub that did some more intelligent verifications first before creating a tailored app.config and starting the .net app.
Why is it ok that you have to invest 2 times number of apps hours just because MS has such a short life cycle for its .NET versions.
.NET Framework should only be used for legacy applications.
Unfortunately there are still many around that depend on .NET Framework.
Microsoft sadly doesn't prioritize this so this might still be the case for a couple of years.
One thing I credit MS for is that they make it very easy to use modern C# features in .NET Framework. You can easily write new Framework assemblies with a lot of C# 14 features. You can also add a few interfaces and get most of it working (although not optimized by the CLR, e.g. Span). For an example see this project: https://www.nuget.org/packages/PolySharp/
It's also easy to target multiple framework with the same code, so you can write libraries that work in .NET programs and .NET Framework programs.
The current solution is to use the CLI tools just like C++.
However have you looked into ComWrappers introduced in .NET 8, with later improvements?
I still see VB 6 and Delphi as the best development experience for COM, in .NET it wasn't never that great, there are full books about doing COM in .NET.
Because that’s pretty much any freaking thing - oh Python, oh PHP, oh driving a fork lift, oh driving a car.
Once you invest time in using and learning it is non issue.
I do get pissed off when I want to use some Python lib bit it just doesn’t work out of the box, but there is nothing that works out the box without investing some time.
Just like a car get a teenager into a car he will drive into first tree.
Posting BS on Facebook shouldn’t be benchmark for how easy things should be.
Thus this should be less of a problem.
Had fewer issues on EndeavourOS (Arch) compared to Fedora overall though... I will stay on Arch from now on.
That seems more a property of npm dependency management than linux dependency management.
To play devil's advocate, the reason npm dependency management is so much worse than kernel/os management, is because their scope is much bigger, 100x more package, each package smaller, super deep dependency chains. OS package managers like apt/yum prioritize stability more and have a different process.
I want to focus on the project itself; not jump through hoops in the build process. It feels hostile.
For cross compiling to ARM from a PC in rust in particular, you do one CLI cmd to add the target. Then cargo run, and it compiles, flashes, with debug output.
These are from anecdotes. I am probably doing something wrong, but it is my experience so far.
uv has more of less solved this (thank god). Night and day difference from Pip (or any of the other attempts to fix it honestly).
At this point they should just deprecate Pip.
Continuing to use Pip because Astral might stop maintaining uv in future is stupidly masochistic.
That's where I stopped.
Toolchains on linux distributions with adults running packaging are just fine.
Toolchains for $hotlanguage where the project leaders insist on reinventing the packaging game, are not fine.
I once again state these languages need to give up the NIH and pay someone mature and responsible to maintain packaging.
And when it inevitably leads to all kinds of weird issues the packagers of course can't be reached for support, so users end up harassing the upstream maintainer about their "shitty broken application" and demanding they fix it.
Sure, the various language toolchains suck, but so do those of Linux distros. There's a reason all-in-one packaging solutions like Docker, AppImage, Flatpak, and Snap have gotten so popular, you know?
This is only the case for debian and derivatives, lol. Rolling-release distributions do not have this problem. This is why most of the new distributions coming out are arch linux based.
You can then build a script/documentation that isolates your specific requirements and workloads:
https://learn.microsoft.com/en-us/visualstudio/install/use-c...
Had to do this back in 2018, because I worked with a client with no direct internet access on it's DEV/build machines (and even when there was connectivity it was over traditional slow/low-latency satellite connections), so part of the process was also to build an offline install package.
(And - it is better on a shared-machine to have everything installed "machine-wide" rather than "per-user", same as PowerShell modules - had another client recently who had a small "C:" drive provisioned on their primary geo-fenced VM used for their "cloud admin" team and every single user was gobbling too much space with a multitude of "user-profile" specific PowerShell modules...)
But - yes, even with a highly trimmed workload it resulted in a 80gb+ offline installer. ... and as a server-admin, I also had physical data-center access to load that installer package directly onto the VM host server via external drive.
I am so fed up with this! Please if you're writing an article using LLMs stop writing like this!
“This isn’t just [what the thing literally is]; it’s [hyperbole on what the thing isn’t].”
Er, sorry. I meant: the purpose isn't just drama—it's a declaration of values, a commitment to the cause of a higher purpose, the first strike in a civilizational war of independence standing strong against commercialism, corporatism, and conformity. What starts with a single sentence in an LLM-rewritten blog post ends with changing the world.
See? And I didn't even need an LLM to write that. My own brain can produce slop with an em dash just as well. :)
But if this is LLM content then it does seem like the LLMs are still improving. (I suppose the AI flavour could be from Grammarly's new features or something.)
It's hated by everyone, why would people imitate it? You're inventing a rationale that either doesn't exist or would be stupider than the alternative. The obvious answer here it they just used an LLM.
> and clearly it serves some benefit to readers.
What?
We need a dictionary like this :D
Know what's more annoying than AI posts? Seeing accusations of AI slop for every. last. god. damned. thing.
So if you see LinkedInglish on LinkedIn, it may or may not be an LLM. Outside of LinkedIn... probably an LLM.
It is curious why LLMs love talking in LinkedInglish so much. I have no idea what the answer to that is but they do.
> The build.bat above isn’t just a helper script; it’s a declaration of independence from the Visual Studio Installer
this is 100% GPT slop, you can even tell it's GPT specifically from the fact that it has a ; instead of — because the recent models were trained to use the emdash less and put a semicolon in the same places it used to throw emdashes in the past.
GPT-4o would have done
>The build.bat above isn’t just a helper script—it’s a declaration of independence from the Visual Studio Installer
>you know why LLMs repeat those patterns so much
Unlike you, I do know why LLMs can fall into repeating certain patterns and it most definitely has nothing to do with "how humans speak". The better the model (as a tool) the more it has been trained on artificially generated data that teaches it the "proper" way to do tasks. Instruction tuned models have nothing to do with the original release of GPT-3, they were their own thing ever since the release of chatGPT itself.
You can control what sort of patterns it falls into and this is why if you had any ability to notice things as a human being you would have seen how newer GPT generated content has less emdash spam even when the human generating the content doesn't bother touching up the text.
I came back around 2017*, expecting the same nice experience I had with VB3 to 6.
What a punch in the face it was...
I honestly cannot fathom anyone developing natively for windows (or even OSX) at this day and age.
Anything will be a webapp or a rust+egui multi-plataform developed on linux, or nothing. It's already enough the amount of self-hate required for android/ios.
* not sure the exact date. It was right in the middle of the WPF crap being forced as "the new default".*
What if it was?
What if it wasn't?
What if you never find out definitely?
Do you wonder that about all content?
If so, doesn't that get exhausting?
I completely agree with your parent that it's tedious seeing this "fake and gay" problem everywhere and wonder what an unwinnable struggle it must be for the people who feel they have to work out if everything they read was AI written or not.
> curl -L -o msvcup.zip https://github.com/marler8997/msvcup/releases/download/v2026...
No thanks. I’m not going to install executables downloaded from an unknown GitHub account named marler8997 without even a simple hash check.
As others have explained the Windows situation is not as bad as this blog post suggests, but even if it was this doesn’t look like a solution. It’s just one other installation script that has sketchy sources.
Just like those complaining about curl|sh on Linux, you are confusing install instructions with source code availability. Just download the script and read it if you want. The curl|sh workflow is no more dangerous that downloading an executable off the internet, which is very common (if stupid) and attracts no vitriol. In no way does it imply that you can not actually download and read the script - something that actually can't be done with downloaded executables.
[0] https://github.com/marlersoft/zigwin32 [1] https://github.com/microsoft/win32metadata
Alternatively, there's this:
Install Visual Studio Build Tools into a container to support a consistent build system | Microsoft Learn
https://learn.microsoft.com/en-us/visualstudio/install/build...
`winget install --id Microsoft.VisualStudio.2022.BuildTools`.
If you need the Windows(/App) SDK too for the WinRT-features, you can add `winget install --id Microsoft.WindowsSDK.10.0.18362` and/or `winget install --id Microsoft.WindowsAppRuntime.1.8`
I used to just install the desktop development one and then work through the build errors until I got it to work, was somewhat painful. (Yes, .vsconfig makes this easier but it still didn’t catch everything when last I was into Windows dev).
"winget install Microsoft.VisualStudio.BuildTools"
"winget install Microsoft.WindowsSDK.10.0.26100"
I don't understand how open source projects can insist on requiring a proprietary compiler.
[1]: https://github.com/microsoft/wil
To answer your question, the headers.
Just use Clang + MSVC STL + WinSDK. Very simple.
Newer C# features like ref returns, structs, spans, et. al., make the overhead undetectable in many cases.
https://github.com/prasannavl/WinApi
https://github.com/microsoft/CsWin32
I have a vague memory of stumbling upon this hell when installing the ldc compiler for dlang [1].
1. https://wiki.dlang.org/Building_and_hacking_LDC_on_Windows_u...
Then you also specify target platform sdk versions in the .csproj file and VS will automatically prompt the developer to install the correct toolchain.
[0] https://learn.microsoft.com/en-us/dotnet/core/tools/global-j...
What you’re actually wanting here is .vsconfig https://learn.microsoft.com/en-us/visualstudio/install/impor...
At $workplace, we have a script that extracts a toolchain from a GitHub actions windows runner, packages it up, stuffs it into git LFS, which is then pulled by bazel as C++ toolchain.
This is the more scalable way, and I assume it could still somewhat easily be integrated into a bazel build.
Edit: Uses a shit load less actual energy than full-building a product thousands of times that never gets run.
* I wonder if Microsoft intentionally doesn't provide this first party to force everyone to install VS, especially the professional/enterprise versions. One could imagine that we'd have a vsproject.toml file similar to pyproject.toml that just does everything when combined with a minimal command line tool. But that doesn't exist for some reason.
You still have to install the tool that processes pyproject.toml so that doesn’t seem fair to hold against it. You are right that you still have to know whether to install 2022 or 2026.
You’ve never experienced genuine pain in your life. Have you tried to change the GCC compiler version in Linux?
If it's not packaged and you've got to build it yourself, Godspeed. And if you've got to change libc versions…
In the span of ~2 hours I didn't manage to find a way to get the Zig compiler to notice the "system" libraries to link against.
Perhaps I'm too spoiled by installing a system wide dependency in a single command. Or Windows took a wrong turn a couple of decades ago and is very hostile to both developers and regular users.
The system libraries should only ship system stuff: interaction with the OS (I/O, graphics basics, process management), accessing network (DNS, IP and TLS). They should have stable APIs and ABIs.
Windows isn't hostile. It has a different paradigm, and Unix (or, more correctly, usually GNU/Linux) people do not want to give up their worldview.
PCRE is basically your app's dependency and yours alone. It has nothing to do with the rest of the operating system. So it is your responsibility to know how to build and package it.
All dependencies should be vendored into your project.
The only issue currently plaguing Windows development is the mess with WinUI and WinAppSDK since Project Reunion, however they are relatively easy to ignore.
People don't need any UNIX biases to just want multiple versions of MSVS to work the way Microsoft advertises. For example, with every new version of Visual Studio, Microsoft always says you can install it side-by-side with an older version.
But every time, the new version of VS has a bug in the install somewhere that changes something that breaks old projects. It doesn't break for everybody or for all projects but it's always a recurring bug report with new versions. VS2019 broke something in existing VS2017 installs. VS2022 broke something in VS2019. etc.
The "side-by-side-installs-is-supposed-to-work-but-sometimes-doesn't" tradition continues with the latest VS2026 breaking something in VS2022. E.g. https://github.com/dotnet/sdk/issues/51796
I once installed VS2019 side-by-side with VS2017 and when I used VS2017 to re-open a VS2017 WinForms project, it had red squiggly lines in the editor when viewing cs files and the build failed. I now just install different versions of MSVS in totally separate virtual machines to avoid problems.
I predict that a future version, VS2030, will have install bugs that break VS2026. The underlying issue that causes side-by-side bugs to reappear is that MSVS installs are integrated very deeply into Windows: they put files in c:\windows\system32, etc. (And sometimes you also get random breakage from mismatched MSVCRT???.DLL files.) To avoid future bugs, Microsoft would have to re-architect how MSVS works -- or "containerize" it to isolate it more.
In contrast, gcc/clang installs can be isolated so that versions don't interfere with each other.
I'm not arguing this thread's msvcup.exe tool is necessary but I understand the motivations to make MSVS less fragile and more predictable.
That's why docker build environments are a thing - even on Windows.
Build scripts are complex, and even though I'm pretty sure VS offers pretty good support for having multiple SDK versions at the same time (that I've used), it only takes a single script that wasn't written with versioning in mind, to break the whole build.
But this isn't true. Many distros package major versions of GCC/LLVM as separate packages, so you can install and use more than one version in parallel, no Docker etc. required.
It can indeed be true for some things, such as the C library, but often not for the compilers.
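On Debian/Ubuntu, for instance (package names vary by distro and release):

    # two major versions side by side
    sudo apt install gcc-12 gcc-13
    # pick one per invocation...
    gcc-12 -o app main.c
    # ...or per build
    make CC=gcc-13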
I wouldn't start an app in most of them today, but I wouldn't rewrite one either without a good reason.
There's a fun bug in WPF with form backgrounds, for example, which means that on fractional-DPI screens the background is tiled unpredictably. Had to patch that one up rather quickly one day, and it was a mess due to how damn complicated WPF is.
(granted we made our own MFC around it)
[1] https://clang.llvm.org/docs/MSVCCompatibility.html
[2] https://clang.llvm.org/docs/UsersManual.html#clang-cl
https://gist.github.com/mmozeiko/7f3162ec2988e81e56d5c4e22cd...
This tool would have been a great help if I had known about it beforehand.
That package manager command, at the very least, pulls in 50+ packages of headers, compilers, and their dependencies from tens of independent projects, nearly each of them following its own release schedule. Linux distributions have it much harder orchestrating all of this, and yet it's Microsoft that cannot get its wholly-owned thing together.
winget install Microsoft.VisualStudio.2022.BuildTools
What is the minimal winget command to get everything installed, ready for `cl main.cpp`?
PS: I mean a winget command which does not ask anything, neither on the command line nor in a GUI. Totally unattended.
Install the toolchains, load the 'VSDevShell' DLL[2] for PowerShell, and you're good to go, with three different toolchains now. Done and dusted. Load these into a CMake toolchain file and never look at them again (a minimal sketch follows the links below). People really like overcomplicating their lives.
At the same time, learn the drawbacks of all toolchains and use what is appropriate for your needs. If you want to write Windows drivers, then forget about anything non-MSVC (unless you really want to do things the hard way for the hell of it). link.exe is slow as molasses but can do incremental linking natively. cl.exe's code gen is (sometimes) slightly worse than Clang's. The MinGW ABI does not understand things like SAL annotations[3], and this breaks very useful libraries like WIL[4] (or libraries built on top of them, like the Azure C++ SDK[5]). The MinGW headers sometimes straight up miss newer features that the Windows SDK comes with, like cfapi.h[6].
[1]: https://github.com/mstorsjo/llvm-mingw
[2]: https://learn.microsoft.com/en-gb/visualstudio/ide/reference...
[3]: https://learn.microsoft.com/en-gb/cpp/c-runtime-library/sal-...
[4]: https://github.com/microsoft/wil
[5]: https://github.com/Azure/azure-sdk-for-cpp
[6]: https://learn.microsoft.com/en-gb/windows/win32/cfapi/build-...
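For illustration, a PowerShell sketch of that flow (paths assume a default Build Tools install location, and the workload choice is just an example; adjust for your version/edition):

    # one-shot, unattended Build Tools install with the C++ workload
    winget install --id Microsoft.VisualStudio.2022.BuildTools --silent `
      --override "--quiet --wait --add Microsoft.VisualStudio.Workload.VCTools --includeRecommended"

    # load the dev-shell module and enter an x64 developer environment
    Import-Module "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\Common7\Tools\Microsoft.VisualStudio.DevShell.dll"
    Enter-VsDevShell -VsInstallPath "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools" -DevCmdArguments "-arch=x64"

    cl main.cpp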
Good to know LLVM works on Windows too, though.
Not really. It's just different. As a cross-platform dev, all desktop OSs have their own idiosyncrasies that add up to a net of 'they are all equally rather bad'.
This is fantastic and someone at Microslop should take notes.
LLVM doesn't come with the C library headers (VCRuntime) or the executable runtime startup code (VCStartup), both of which are under proprietary Visual Studio licenses. So to use Clang on Windows without MinGW, you need Visual Studio.
Additionally the cross-compiler on Linux also produces binaries with no extra runtime requirements.
Unlike older Mingw64 environments, these link against the latest UCRT, so you get almost the same style of executable as with Visual Studio.
The only difference for C is that it uses MinGW exception handling and global-initialization code, and it uses the Itanium ABI for C++.
A major part of the incompatibility with older versions of Windows is just that newer VS runtimes cut off support artificially. That's it. Many programs would otherwise work as-is or with just a little help.
The Windows-native software you build with MSYS2 can be shipped to and run by users who don't have any part of MSYS2 installed.
Git installs its own MinGW and MSYS2 stuff, but mostly compiled for a MinGW environment, so the tools consume Windows paths natively instead of using MSYS2/Cygwin path conversion. That's why all hell breaks loose with Git when you have a mixed PATH variable.
Doesn't it come with `pacman` too?
https://www.msys2.org/docs/environments/
It gives you a *nix-like shell/dev environment and tools, but you build native software that runs on Windows systems that don’t have or need to have all/parts of MSYS2/Cygwin installed.
I built a network daemon using the MSYS2 CLANG64 environment and llvm toolchain on Windows 10.
Windows 7 x64 users could download the compiled single-file executable and run it just fine, so long as they installed Microsoft’s Universal C Runtime, which is a free download from Microsoft’s website.
I get your point. Although my point is that there is actually zero need for MSYS at all for this, even as a developer, and especially not with the 'CLANG64' environment: those binaries are themselves built to run in the MSYS2 environment. This is how I cross-compile from Windows... to Windows with LLVM-MinGW[1]:
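Roughly like this (a sketch; llvm-mingw ships per-target clang wrappers, assumed to be on PATH here):

    # same host, two Windows targets, no MSYS2 runtime involved
    x86_64-w64-mingw32-clang    main.c   -o main-x64.exe
    aarch64-w64-mingw32-clang++ main.cpp -o main-arm64.exe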
[1]: https://github.com/mstorsjo/llvm-mingw
If you are compiling for your native system, yes.
But as soon as you try cross-compiling, you are in for a lot of pain.
This seems to go down the road toward attempts at deterministic-ish builds, which I think is probably a bad idea, since the whole ecosystem is built on rolling updates, and a partial move toward pinning dependencies (using bespoke tools) could get complicated.
What year is it?! Also, I haven't heard any complaints regarding VS on macOS; how ironic...
(To be clear, I haven't tried this with Nix, but I have with other distros.)
This script is great. Just use it. The title saying "I fixed" is moderately offensive glory-stealing.
What needs to be fixed is the valley between unix and windows development for cross-os/many-compiler builds, so one that does both can work seamlessly.
It's not an easy problem and there are lots of faux solutions that seem to fix it all but don't (in builds, the devil is in edge cases).
Are we doomed to only read AI slop from now on? To get a couple paragraphs in and suddenly be hit with the realization that it is AI?
it's all so tiresome
Now with AI, I would think that porting a native program to the browser wouldn't be the chore it once was.
As long as you don't give a shit about the fact that your baseline memory consumption is now 500MB instead of 25MB, and that 80% of your CPU time is wasted on running javascript through a JIT and rendering HTML instead of doing logic, no.
If you don't give a shit about your users or their time, there's indeed no longer a need to write native programs.
funny how Electron apps tend to have many more users than their native "performant" counterparts, isn't it?
I did try using python and js but the variable explorer is garbage due to 'late binding'.
I thought this was just my ignorance, but I've asked experts, AI, and google searched and they unfortunately agree. That said, some people have created their own log/prints so they don't need to deal with it.
Incremental compilation and linking, parallel builds, hot code reloading, REPL, graphical debugging of optimised builds, GPU debugging...
Go is better left for devops stuff like Docker and Kubernetes, and Zig remains to be seen when it becomes industry relevant beyond HN and Reddit forums.
I got anxiety reading the article, describing exactly why it sucks. It's nice to know from the article and comments here there are ways around it, but the way I have been doing it was the "hope I check the right checkboxes and wait a few hours" plan. There is usually one "super checkbox" that will do the right things.
I have to do this once per OS [re]install generally.
Imagine not being capable of installing Visual Studio, what a shame.
That's pure incompetence or virtual point farming.