Rethinking C++: Architecture, Concepts, and Responsibility

(blogs.embarcadero.com)

54 points | by timeoperator 5 days ago

10 comments

  • simonask 4 hours ago
    From TFA:

    > C++ is often described as complex, hard to learn, and unsafe. That reputation is undeserved. The language itself is not unsafe. On the contrary: it is precise, honest, and consistent. What is unsafe is how it is used if it is misunderstood or if one remains in old patterns.

    I think this take needs to stop. It’s a longer way to say “skill issue”. Meanwhile, decades of industry experience have shown that the governing principles of even modern C++ make it incredibly hard (expensive) to deliver high quality software. Not impossible - there’s lots of examples - but unreasonably hard.

    C++ is fundamentally unsafe, because that’s how the language works, and if you think otherwise, you don’t know C++. There are patterns and paradigms that people use to limit the risk (and the size of the impact crater), and that’s helpful, but usually very difficult to get right if you also want any of the benefits of using C++ in the first place.

    Certain people will disagree, but I surmise that they haven’t actually tried any alternative. Instead they are high on the feeling of having finally grokked C++, which is no small feat, and I know because I’ve been there. But we have to stop making excuses. The range of problems where C++ is unequivocally the superior solution is getting smaller.

    • amluto 3 hours ago
      The author seems to be writing about a dream language that isn’t actually C++. For example:

      > In my streams I showed how RAII can thereby also be applied to database operations: a connection exists as long as it is in scope.

      Only if that connection object doesn’t support move — we’re 12 years of C++ standards past the arrival of move, and it still leaves its source in an indeterminate state.
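
      Concretely (a rough sketch; Connection here is a made-up type, not anything from the article):

          #include <memory>
          #include <utility>

          // Hypothetical RAII connection, purely for illustration.
          struct Connection {
              // pretend the int is a native handle; unique_ptr supplies move semantics
              std::unique_ptr<int> handle = std::make_unique<int>(42);
          };

          void consume(Connection) {}

          int main() {
              Connection conn;             // "a connection exists as long as it is in scope"...
              consume(std::move(conn));    // ...until someone moves it out
              // conn is still in scope but its handle is now null: the scope-based
              // lifetime story quietly stopped being true, and nothing warns you
              // if you keep using conn.
          }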

      > With std::variant, C++ gained a tool that allows dynamic states without giving up type safety.

      variant is not quite type-safe:

      https://en.cppreference.com/w/cpp/utility/variant/valueless_...
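
      For anyone who hasn't hit it, a quick sketch of how that state arises (ThrowsOnMove is just an illustration):

          #include <cassert>
          #include <stdexcept>
          #include <variant>

          struct ThrowsOnMove {
              ThrowsOnMove() = default;
              ThrowsOnMove(ThrowsOnMove&&) { throw std::runtime_error("boom"); }
          };

          int main() {
              std::variant<int, ThrowsOnMove> v = 42;
              try {
                  // the old int is destroyed first, then the move throws mid-construction
                  v.emplace<ThrowsOnMove>(ThrowsOnMove{});
              } catch (const std::runtime_error&) {}
              assert(v.valueless_by_exception());   // "type-safe", yet currently holds nothing
              // std::get or std::visit on v now throws std::bad_variant_access.
          }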

      > With C++20, std::ranges decisively extends this principle: it integrates the notions of iterator, container, and algorithm into a unified model that combines type safety and readability.

      Ranges may be type-safe, but they’re not safe. Like string_view, a range is a reference, and the language does not help ensure that the referent remains valid.
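
      Both of these build, for example (hypothetical code, not from the article; a good compiler may warn about the first, but nothing requires it to):

          #include <ranges>
          #include <string>
          #include <string_view>
          #include <vector>

          std::string_view greet() {
              std::string s = "hello";
              return s;   // returns a view into a string that is about to be destroyed
          }

          auto evens() {
              std::vector<int> v{1, 2, 3, 4};
              return v | std::views::filter([](int x) { return x % 2 == 0; });
              // the returned view still refers to the local vector, which dies here
          }

          int main() {
              auto sv = greet();   // dangling
              auto ev = evens();   // dangling
              // using sv or iterating ev is undefined behaviour; nothing stops the build
              (void)sv; (void)ev;
          }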

    • ActorNightly 1 hour ago
      >Meanwhile, decades of industry experience have shown that the governing principles of even modern C++ make it incredibly hard (expensive) to deliver high quality software.

      Unreal Engine is C++ based and plenty of games have used it.

      Fundamentally, when it comes to safety, it's either everything or nothing. Rust is by definition unsafe, because it has an "unsafe" keyword. If the programmer has enough discipline not to use unsafe everywhere, he/she has enough discipline to write normal C++ code.

      But as far as C++ goes, the main problem is that the syntax still allows C-style pointers and dereferencing for compatibility with C code. Generally, if you stick to using std library constructs and smart pointers for everything, the code becomes very clean. unique_ptr is basically the thing that inspired Rust's ownership semantics, after all.
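
      A tiny sketch of what sticking to unique_ptr buys you (and the one thing it still doesn't catch):

          #include <memory>
          #include <utility>
          #include <vector>

          struct Widget { int id = 0; };

          int main() {
              auto w = std::make_unique<Widget>();   // single owner, no naked new/delete
              std::vector<std::unique_ptr<Widget>> pool;
              pool.push_back(std::move(w));          // ownership transfers; w is now null
              // Leaks and double-frees are gone by construction, but the compiler
              // still won't stop you from dereferencing the moved-from w.
          }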

    • wavemode 2 hours ago
      > decades of industry experience have shown that the governing principles of even modern C++ make it incredibly hard (expensive) to deliver high quality software

      How can "decades of experience" show the deficiencies of Modern C++, which was invented in 2011?

      If you've worked on a project built entirely on the technologies and principles of modern C++, and found that it caused your software to be low-quality, then by all means share those experiences. C++ has no shortage of problems, there is no doubt. But hand-waving about "decades" of nondescript problems other people have had with older C++ versions, is somewhat of a lazy dismissal of the article's central thesis.

      • simonask 6 minutes ago
        > How can "decades of experience" show the deficiencies of Modern C++, which was invented in 2011?

        Please read the sentence you're quoting again.

    • nurettin 1 hour ago
      Isn't it a "skill issue" whenever you complain about a language? I can say rust is too slow and too convoluted, and you will tell me "well stop fsckin cloning everything and use Arc everywhere. Oh and yeah lifetimes are hard but once you get used to it you will write perfect programs like my claude max subscription does!"
    • instig007 3 hours ago
      > The range of problems where C++ is unequivocally the superior solution is getting smaller.

      The range of issues where the superior solutions offer language features superior to the features of modern C++ is getting smaller too.

      • surajrmal 3 hours ago
        The C++ features that get bolted on to replicate those in other languages tend to never reach parity because of all the legacy baggage they need to design around. Modules are not nearly as useful as one would hope. std::variant and std::optional are not nearly as ergonomic or safe to use as the Rust equivalents. Coroutines are not exactly what anyone really wanted. If you're simply looking for checkboxes on features then I suppose you have a point.

        To be clear, I like and continue to use modern C++ daily, but I also use Rust daily and you cannot really make a straight-faced argument that C++ is catching up. I do think both languages offer a lot that higher-level languages like Go and Python don't, which is why I never venture into those languages, regardless of performance needs.

        • instig007 2 hours ago
          > std::variant and std::optional are not nearly as ergonomic or safe to use as rust equivalents.

          > but I also use rust daily and you cannot really make a straight faced argument that c++ is catching up.

          I mostly use std::ranges, lambdas, and concepts, and I see them catching up, as an evolutionary process rather than a fixed implementation in the current standard. Nowadays I can do familiar folds and parallel traversals that I couldn't do in the past without pulling in third-party libraries. My optionals are empty vectors: it suits my algorithms and interfaces a lot, and I never liked `Maybe a` anyway (`Either errorOrDefault a` is so much better). I also use Haskell a lot, and I'm used to the idea that outside of my functional garden the industry's support for unions is hardly different from the support for 2-tuples of (<label>, <T>), so I don't mind the current state of std::variant either.
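
          For example, a fold over a lazy view now works with just the standard library (a small sketch; std::ranges::fold_left needs C++23):

              #include <algorithm>   // std::ranges::fold_left (C++23)
              #include <functional>
              #include <ranges>
              #include <vector>

              int main() {
                  std::vector<int> xs{1, 2, 3, 4, 5};
                  auto squares = xs | std::views::transform([](int x) { return x * x; });
                  int sum_sq = std::ranges::fold_left(squares, 0, std::plus{});   // 55
                  (void)sum_sq;
              }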

      • simonask 3 hours ago
        There’s definitely holes, but I’m wondering what you are referring to here.
  • gwbas1c 4 hours ago
    > Library vendors must have the courage to create a new generation of libraries—libraries that consistently use concepts, typelists, ranges, and compile‑time mechanisms. Compiler vendors, in turn, are responsible for continuing this development and fully unlocking the new language means.

    > But all of us—the C++ developers—must go back to school. We must learn C++ anew, not because we have forgotten it, but because through evolution it has become a different language. Only those who understand the modern language constructs can use the new tools properly and unfold the potential of this generation of libraries.

    Once you get to that point, you might as well create and learn a different language.

    • instig007 4 hours ago
      > Once you get to that point, you might as well create and learn a different language.

      Nope, it's still incredibly valuable to be able to compile two different translation units as C++14 and C++26 and then later link them together (all without leaving the familiar toolchains and ecosystems). That's how big legacy projects can evolve towards better safety incrementally.

      • MontagFTB 3 hours ago
        If the Standard has anything to say about compatibility between different language versions, I doubt many developers know those details. This is a breeding ground for ODR violations, as you're likely mixing compilers with different output (as they were built in different eras of the language's lifetime), especially at higher optimization settings.

        This flies in the face of modern principles like building all your C++, from source, at the same time, with the same settings.
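
        A contrived sketch of the hazard (my own example, not from the article): a header whose layout quietly depends on the language version, included from TUs built with different -std flags, is an ODR violation with no diagnostic required:

            // common.h -- included by one TU built as C++14 and another as C++26
            #pragma once
            #include <cstddef>

            struct Record {
                std::size_t id;
            #if __cplusplus >= 202002L
                std::size_t trace_tag;   // only the newer TU sees this member
            #endif
            };

            inline std::size_t record_size() { return sizeof(Record); }
            // Each TU now sees a different Record and a different inline function body;
            // the linker keeps one of them and the program silently disagrees with itself.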

        Languages like Rust include these settings in symbol names as a hash to prevent these kinds of issues by design. Unless your whole team is made of moderate-level language lawyers, you must enforce this by some other means or risk some really gnarly issues.

        • PaulDavisThe1st 2 hours ago
          > Languages like Rust include these settings in symbol names as a hash to prevent these kinds of issues by design.

          Historically, C++ compilers' name mangling scheme for symbols did precisely the same thing. The 2000-2008 period for gcc was particularly painful since the compiler developers changed it very frequently, to "prevent these kinds of issues by design". The only reason most C++ developers don't think about this much any more is that most C++ compilers haven't needed to change their mangling scheme for a decade or more.

          • MontagFTB 2 hours ago
            C++’s name mangling scheme handles some things like namespaces and overloading, but it does not account for other settings that can affect the ABI layer of the routine, like compile time switches or optimization level.
            • PaulDavisThe1st 1 hour ago
              The name mangling scheme was changed to reflect things other than namespaces and overloading; it was modified to reflect fundamental compiler version incompatibilities (i.e. the ABI).

              Optimization level should never cause link time or run time issues; if it does I'd consider that a compiler/linker bug, not an issue with the language.

  • recursivecaveat 2 hours ago
    It's okay to admit when C++ doesn't have a feature. std::variant is an approximation of sum type support. Its ergonomics are an absolute travesty as a result: any errors you get are going to be 5 pages of template gunk, and I'm sure that using it pervasively is terrible for compile times. It has been possible to construct a type-safe units library, as demonstrated in the article, for forever. I've never seen anyone use this ability in a production codebase, because a terrible library emulation of a feature is not a real feature.

    The notion that just using the new fancy types automatically makes everything memory safe has to stop. std::expected contains either a value or an error. If you call .value() and you're wrong you get an exception. If you call .error() and you're wrong you get undefined behaviour. This was added in C++23. Since there's no destructuring you have to call these methods, btw; just don't make any mistakes with your preconditions! Regardless, 90% of memory safety errors I see are temporal. Unless we completely ban references and iterators they will not be going anywhere. Using unique_ptr instead of new does not do anything when you insert into a map while holding a reference to an element.
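
    To spell out the std::expected point (minimal sketch; parse() is made up):

        #include <expected>   // C++23
        #include <iostream>
        #include <string>

        std::expected<int, std::string> parse(const std::string& s) {
            if (s.empty()) return std::unexpected("empty input");
            return static_cast<int>(s.size());
        }

        int main() {
            auto r = parse("");
            // r.value() here throws std::bad_expected_access<std::string>;
            // r.error() on a *successful* result would be plain undefined behaviour.
            // No destructuring, so every access site carries the precondition.
            if (!r) std::cout << r.error() << '\n';
        }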

    Developers also have to be able to make their own things. We can't pretend that absolutely everything we will ever need is bundled up in some perfect library. To write a typesafe application you need to be able to create your own domain specific abstractions, which to me precludes them looking like this:

        template <class ty>
        concept db_result_tuple = requires { typename remove_cvref_t<ty>; }
                               && []<class... Es>(std::tuple<Es...>*) {
                                    return all_result_args_ok_v<Es...>;
                                  }( static_cast<typename std::add_pointer_t<remove_cvref_t<ty>>>(nullptr) );
  • fsloth 9 hours ago
    I get the feeling the author would just like to use a better language, like F# or OCaml, and completely misses the point of what makes C++ valuable.

    C++ is valuable, because the existing tooling enables you to optimize the runtime performance of a program (usually you end up figuring out the best memory layout and utilization).

    C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to the latest standards.

    C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).

    I simply don't understand what type of industrial use this type of theoretical abstraction building serves.

    Using the metaprogramming features makes code bases extremely hard to modify, and they don't actually protect you from whole categories of runtime errors. I'm speaking from experience.

    I would much rather have a codebase with a bit more boilerplate, a few more unit tests, and a strong integration testing suite.

    The longer I use C++ the more I'm convinced something like Orthodox C++ is the best method to approach the language https://bkaradzic.github.io/posts/orthodoxc++/

    This keeps the code maintainable and performant (with less effort than metaprogramming-directed C++).

    Note: the above is just an opinion, with a very strong YMMV flavour, coming from two decades in CAD, real time graphics and embedded development.

    • jandrewrogers 6 hours ago
      C++20 inverts the traditional relationship between the core language and metaprogramming, which arguably makes it a new language in some ways. Instead of being a quirky afterthought, metaprogramming has become the preferred way to interact with code. There is a point of friction in that the standard library doesn't (and can't) fully reflect this change.

      Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.
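
      A toy illustration of the stylistic gap, for those who haven't looked recently (my sketch, nothing from the article):

          #include <concepts>
          #include <type_traits>

          // Pre-C++20: constrain via SFINAE and hope the error messages are survivable.
          template <typename T,
                    typename std::enable_if<std::is_integral<T>::value, int>::type = 0>
          constexpr T twice_legacy(T x) { return x + x; }

          // C++20: the constraint is part of the signature and reads like prose.
          template <std::integral T>
          constexpr T twice(T x) { return x + x; }

          static_assert(twice_legacy(21) == 42);
          static_assert(twice(21) == 42);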

      It took me a bit to develop the intuitions for idiomatic C++20 because it is significantly different as a language, but once I did there is no way I could go back. The degree of expressiveness and safety it provides is a large leap forward.

      Most C++ programmers should probably approach it like a new language with familiar syntax rather than as an incremental update to the standard. You really do need to hold it differently.

      • jebarker 5 hours ago
        As someone that has only dabbled in C++ over the past 10 years or so, it feels like each new release has this messaging of “you have to think of it as a totally new language”. It makes C++ very unapproachable.
        • jandrewrogers 1 hour ago
          It isn’t each release but there are three distinct “generations” of C++ spanning several decades where the style of idiomatic code fundamentally changed to qualitatively improve expressiveness and safety. You have legacy, modern (starting with C++11), and then whatever C++20 is (postmodern?).

          This is happening to many older languages because modern software has more intrinsic complexity and requires more rigor than when those languages were first designed. The languages need to evolve to effectively address those needs or they risk being replaced by languages that do.

          I’ve been writing roughly the same type of software for decades. What would have been considered state-of-the-art in the 1990s would be a trivial toy implementation today. The languages have to keep pace with the increasing expectations for software to make it easier to deliver reliably.

        • gpderetta 4 hours ago
          As someone that has been using C++ extensively for the last 25 years, each release has felt like an incremental improvement. Yes, there are big chunks in each release that are harder to learn, but usually a team can introduce them at their own pace.

          The fact that C++ is a very large and complex language, and that this makes it unapproachable, is undeniable, but I don't think the new releases make it significantly worse. If anything, I think that some of the new stuff does ease the on-ramp a bit.

      • TYPE_FASTER 4 hours ago
        > Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.

        This was my takeaway as well when I revisited it a few years ago. It's a very different, and IMO vastly improved, language compared to when I first used it decades ago.

      • jnwatson 2 hours ago
        If you're going to go through the effort of learning a new language, it makes sense to consider another language altogether, one without 30 years of accumulated cruft.
        • jandrewrogers 43 minutes ago
          An advantage is that if you already know the older language then you don’t have to learn the new idioms up front to use it. You can take your time and still be productive. It isn’t why I would use it but it is a valid reason.

          I have used many languages other than C++20 in production for the kind of software I write. I don’t have any legacy code to worry about and rarely use the standard library. The main thing that still makes it an excellent default choice, despite the fact that I dislike many things about the language, is that nothing else can match the combination of performance and expressiveness yet. Languages that can match the performance still require much more code, sometimes inelegant, to achieve an identical outcome. The metaprogramming ergonomics of C++20 are really good and allow you to avoid writing a lot of code, which is a major benefit.

      • pjmlp 6 hours ago
        I only wish concepts were easier; those of us that don't use C++ daily have to look up the requires-expression syntax all the time. Much better than the old ways, I guess.
      • surajrmal 3 hours ago
        Wait until people see how reflection in C++26 further pushes the metaprogramming paradigm. I'm more hopeful for reflection than I have been for any other C++ feature that has landed in the last decade (concepts, modules, coroutines, etc.).
    • pjmlp 6 hours ago
      Speaking as someone that had the option to choose between C and C++, coming from compiled BASIC and Object Pascal backgrounds, back in the early 1990s:

      What makes C++ valuable is being a TypeScript for C, born on the same UNIX and Bell Labs farm (so to speak), allowing me to tap into the same ecosystem while enjoying the high-level abstractions of programming languages like Smalltalk, Lisp, or even Haskell.

      Thus I can program on MS-DOS limited to 640 KB, an ESP32, an Arduino, a CUDA card, or a distributed system cluster with TBs of memory, selecting which parts are more convenient for the specific application.

      Naturally, in 2025 I would like to be able to exercise such workflows with a compiled managed language instead of C++; however, I keep being in the minority, thus language XYZ + C++ it is.

      • AnimalMuppet 4 hours ago
        Does go count as managed, in your view? (Honest question - I don't know go well enough to have much of an opinion.)
        • olluk 4 hours ago
          I'd call C#, Java, Go, Python, JS, etc. managed. Something with GC, i.e., managed memory.
        • pjmlp 4 hours ago
          Yes, managed languages are all those that have some form of automatic resource management, regardless of what shape it takes, or a more high-level language runtime.

          Using Go as an example, and regarding the being-in-the-minority remark, you will remember the whole discussion about whether Go is a systems language or not, and how it was backpedaled to mean distributed systems, not low-level OS systems programming.

          Now, I remember when programming compilers, linkers, OS daemons/services, IoT devices, firmware was considered actual systems programming.

          But since Go isn't bootstrapped, and TinyGo and TamaGo don't exist, that naturally isn't possible. /s

    • ghosty141 5 hours ago
      > C++ is valuable, because the existing tooling enables you to optimize the runtime performance of a program

      This is true for MANY other languages too, so I don't see how this makes C++ different. With gdb it's quite the opposite: handling C++ types with gdb can be a nightmare, and you either develop your own gdb glue code or write C-like C++.

      > C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to the latest standards.

      In times of constant security updates (see the EU's CRA or equivalent standards in the US) you always gotta update your environment which often also means updating tooling etc. if you don't wanna start maintaining a super custom ecosystem.

      I don't see this as a positive in general; there is bit rot, and software that is stuck in the past is generally not a good sign imo.

      > C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).

      Sanitizers are not C++ exclusive either, and with Rust or C# you almost never need them, for example. Yes, C++ has extensive debugging tools, but a big part of that is because the language has very few safeguards, which naturally leads to a lot of crashes etc.

      I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL. Of course it gives me more control etc., but I'd argue that most of the time writing Orthodox C++ won't save time even in the long run. It will save you headaches and cursing about C++ being super complicated, but in modern environments you will just reinvent the wheel a lot and run into problems already solved by the STL.

      • mkornaukhov 5 hours ago
        > handling C++ types with gdb can be a nightmare and you either develop your own gdb glue code or write C-like C++.

        That's why it's better to use lldb and its scripting support.

        > I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL.

        Yeah, agree. It's just much easier to take a "framework" (or frameworks) where all the main problems are solved: convenient parallelism mechanisms, scheduler, reactor, memory handling, etc. So it turns out you're kinda writing in your own ecosystem that's not really different from another language, just in C++ syntax.

    • jokoon 5 hours ago
      Orthodox C++ should be a subset of C++; I would really use it if there was a compiler flag for it.

      I can imagine it might be insanely faster to compile

    • gpderetta 8 hours ago
      sorry, I can't take something that argues for "printf" in favour of anything else seriously.
      • locknitpicker 6 hours ago
        > sorry, I can't take something that argues for "printf" in favour of anything else seriously.

        I think you're arguing from a position of willful ignorance. The article is clear on how it lauds C++'s std::println, not printf.

        http://en.cppreference.com/w/cpp/io/println.html

        Here's what the article argues:

        > With std::format, C++ has gained a modern, powerful, and safe formatting system that ends the classic, error‑prone printf mechanisms. std::format is not merely convenient but fully type‑safe: the compiler checks that placeholders and data types match.

        Solid remark, and in line with the consensus that std::println and std::format are an important improvement over std::cout or C's printf.
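
        Roughly what that looks like in practice (std::println is C++23; std::format alone only needs C++20):

            #include <format>
            #include <print>    // std::println (C++23)
            #include <string>

            int main() {
                std::string name = "world";
                std::println("hello {} ({} chars)", name, name.size());   // checked at compile time
                auto padded = std::format("{:>8}", 42);                    // "      42"
                // std::println("{:d}", name);   // would not compile: {:d} doesn't apply to a string
                (void)padded;
            }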

        • gpderetta 5 hours ago
          I was referring to the Orthodox C++ article linked by parent. Of course format is an improvement on both printf and iostream.
      • jstimpfle 7 hours ago
        I'll bite. printf might be unsafe in terms of typing, in theory, but it's explicit and readable (with some caveats such as "PRIi32"). The actual chance of errors happening is very low in practice, because format strings are static in all practical (sane) uses so testing a single codepath will usually detect any programmer errors -- which are already very rare with some practice. On top of that, most compilers validate format strings. printf compiles, links, and runs comparatively quickly and has small memory footprint. It is stateless so you're always getting the expected results.

        Compare to <iostream>, which is stateful and slow.

        There's also std::format, which might be safe and flexible and have some of the advantages of printf. But I can't use it at any of the places I'm working since it's C++20. It probably also uses a lot of template and constexpr madness, so I assume it's going to lead to longer compilation times and hard-to-debug problems.

        • TinkersW 6 hours ago
          In my experience you absolutely must have type checking for anything that prints, because eventually some never-previously-triggered log/assertion statement is hit, attempts to print, and has an incorrect format string.

          I would not use iostreams, but neither would I use printf.

          At the very least if you can't use std::format, wrap your printf in a macro that parses the format string using a constexpr function, and verifies it matches the arguments.
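
          Roughly this kind of thing, as a minimal sketch (arity check only, no type checking; assumes C++20 for consteval and __VA_OPT__):

              #include <cstddef>
              #include <cstdio>
              #include <tuple>

              // Count conversion specifiers in a literal format string at compile time.
              consteval std::size_t count_specs(const char* f) {
                  std::size_t n = 0;
                  while (*f) {
                      if (*f == '%') {
                          ++f;
                          if (*f == '\0') break;
                          if (*f != '%') ++n;   // "%%" is a literal percent, not a specifier
                      }
                      ++f;
                  }
                  return n;
              }

              // decltype keeps the arguments unevaluated, so runtime values are fine here.
              #define CHECKED_PRINTF(fmt, ...)                                            \
                  do {                                                                    \
                      constexpr std::size_t n_args =                                      \
                          std::tuple_size_v<decltype(std::make_tuple(__VA_ARGS__))>;      \
                      static_assert(count_specs(fmt) == n_args,                           \
                                    "printf format/argument count mismatch");             \
                      std::printf(fmt __VA_OPT__(,) __VA_ARGS__);                         \
                  } while (0)

              int main() {
                  CHECKED_PRINTF("%d of %zu bytes\n", 1, sizeof(int));   // ok
                  // CHECKED_PRINTF("%d %d\n", 1);                       // fails to compile
              }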

          • jstimpfle 5 hours ago
            _Any_ code that was never previously exercised could be wrong. printf() calls are typically typechecked. If you write wrappers you can also have the compiler type check them, at least with GCC. printf() code is quite low risk. That's not to say I've never passed the wrong arguments. It has happened, but a very low number of times. There is much more risky code.

            So such a strong "at the very least" is misapplied. All this template crap, I've done it before. All but the thinnest template abstraction layers typically end up in the garbage can after trying to use them for anything serious.

            • TinkersW 1 hour ago
              Error log/assertion prints are by far the most likely code to not have been run before. Some compilers type-check printf, but not all.
        • gpderetta 5 hours ago
          The biggest issue with printf is that it is not extensible to user types.
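
          Whereas with std::format a user type opts in with a small specialization (rough sketch; Point is obviously made up):

              #include <format>
              #include <iostream>

              struct Point { double x, y; };   // hypothetical user type

              // Minimal formatter: accepts only an empty format spec, e.g. "{}".
              template <>
              struct std::formatter<Point> {
                  constexpr auto parse(std::format_parse_context& ctx) { return ctx.begin(); }
                  auto format(const Point& p, std::format_context& ctx) const {
                      return std::format_to(ctx.out(), "({}, {})", p.x, p.y);
                  }
              };

              int main() {
                  std::cout << std::format("the point is {}\n", Point{1.5, -2.0});
              }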

          I also find it unreadable; beyond the trivial I always need to refer to the manual for the correct format string. In practice I tend to always put a placeholder and let clangd correct me with a fix-it.

          Except that often clangd gives up (when inside a template for example), and in a few cases I have even seen GCC fail to correctly check the format string and fail at runtime (don't remember the exact scenario).

          Speed is not an issue, any form of formatting and I/O is going to be too slow for the fast path and will be relegated to a background thread anyway.

          Debugging and complexity have not been an issue with std::format so far (our migration from printf-based logging has been very smooth). I will concede that I do also worry about the compile-time cost.

        • surajrmal 2 hours ago
          I largely avoided iostream in favor of printf-like logging apis, but std::format changed my mind. The only hazard I've found with it is what happens when you statically link the std library. It brings in a lot of currency and localization nonsense and bloats the binary. I'm hoping for a compiler switch to fix that in the future. libfmt, which std::format is based on, doesn't have this problem.
      • unwind 8 hours ago
        The article argues that modern C++ has type-checked string formatting, so it does not argue for (unchecked) `printf()`, right?
        • vintagedave 7 hours ago
          "The article" is ambiguous. The one this HN post is about does not argue for it, at all. But the one in the comment above directly says,

          > Don’t use stream (<iostream>, <stringstream>, etc.), use printf style functions instead.

          and has a code example of what they argue 'Orthodox C++' should be, which uses printf.

          I'm all for a more sensible or understandable C++, but not at the expense of losing safety. In fact I would prefer the other way: I still feel incredibly saddened that Sean Baxter's Circle proposal for Safe C++ is not where the language is heading. That, plus some deep rethinking and trimming of some of the worst edge behaviours, and a neater standard library, would be incredible.

          • gpderetta 5 hours ago
            I was indeed referring to the 'Orthodox C++ article'.
  • hsaliak 1 hour ago
    When I got into computing, it was a refuge from societal expectations. Now you cannot just code in whatever programming language, because of societal expectations to do it safely, do it with modern libraries, with the right build system, etc. Just do what you like. It's OK. Have fun.
  • pjmlp 9 hours ago
    I still love C++ Builder. Regardless of all the Borland missteps that led to where Embarcadero is today, it is the survivor of C++ RAD IDE tooling; Visual C++ never was as Visual as its name implies.
    • kaiken1987 3 hours ago
      Builder and Delphi 6 had a way to build and design UIs that worked smoothly, something I've yet to see from another UI framework.
      • pjmlp 1 hour ago
        Indeed, a pity that they are only available to those of us that don't mind using the community editions, or who work at companies that usually don't care that much about commercial license prices, meaning project delivery costs are measured in millions.

        Sure there is FreePascal and Lazarus, sadly it doesn't get enough love.

  • yosefk 8 hours ago
    "Many—especially historically minded—developers complain that modern C++ compilers take longer to compile. But this criticism is short‑sighted. You cannot compare C++ compile times with compilation in other languages, because the compiler is doing something entirely different."
    • rerdavies 8 hours ago
      If only it would do something entirely different faster. :-(

      Somebody really needs to rethink the entire commitment to metaprogramming. I had some hope that concepts would improve error reporting, but they seem to actually make it worse, and if they improve compile times at all, I'm not seeing it.

      And it has nothing to do with historicity. Every time I visit another modern language (or use it seriously) I am constantly reminded that C++ compile times are simply horrible, and a huge impediment to productivity.

    • fsloth 5 hours ago
      A slow compiler impedes developers' velocity, not only by taking longer, but by breaking their concentration.

      The whole point of a programming language is to be an industrial productivity tool that is faster to use than hand writing assembly.

      Performance is a core requirement for industrial tools. It's totally fine to have slow compilers in R&D and academia.

      In industry a slow compiler is an inexcusable pathology. Now, it may be that the pathology can't be fixed, but not recognizing it as a pathology - and worse, inventing excuses for it - implies the writer is not really industrially minded. Which makes me very worried, given that they are commenting on an industrial language.

    • pjmlp 6 hours ago
      We can easily complain, because there were attempts to improve things in the past, like Energize C++ and Visual Age for C++ v4, or systems like Live++.

      However too many folks are stuck in the UNIX command line compiler mindset.

      I keep bumping into people that have no idea about the IDE-based compilation workflows from C++ Builder and Visual C++: their multithreaded compilation, incremental compilation and linking, pre-compiled headers that actually work, hot code reloading, and many other improvements.

      Or the CERN C++ interpreters for that matter.

      Many don't seem to ever have ventured beyond calling gcc or clang with Makefiles, and nothing else.

    • ahartmetz 3 hours ago
      I wonder if it's time to implement some library features in the compiler. Some things are very widely used and very rarely modified. It should be possible to opt out and use the library version, of course.
    • gpderetta 8 hours ago
      As a long-time C++ user I definitely complain that C++ takes long to compile. Then again, I always have.
    • ozgrakkurt 7 hours ago
      This is also because LLVM and GCC are just slow, right? Are there any alternative C++ compilers that are faster, maybe?
  • dustfinger 3 hours ago
    Despite all the criticism, C++ has been my favorite language for 20+ years. Nowadays I code 99% of the time in Python (and previously TypeScript), but all my personal for-fun projects are in C++. I just enjoy coding in it so much more. I know I will get some heat for this, but I would love to live in an idealistic world where computers were still for hackers and it was all just about fun, not profit. I was so inspired by the writings of Eric S. Raymond, I always hoped I would experience some of that. ~SIGH~ Maybe in my retirement.
    • ActorNightly 44 minutes ago
      > but I would love to live in an idealistic world where computers were still for hackers

      This never changed.

      In the past, hacking was exploiting human errors in writing faulty code. These days, it's pretty much the same thing, except the faulty code isn't things like buffer overflows due to missing bounds checking, but higher-level faulty software with things like password reuse, no two-factor authentication, and so on.

  • trzy 2 hours ago
    I think they should chuck the STL and start over. I left C++ for a while and spent a lot of time in C# and Swift. Going back to C++ is painful because the interfaces feel very non-uniform and cumbersome.

    I also think that named parameters would go a long way toward improving the language.

    Lastly, explore some way to make a breaking change from "old C++" possible.