• bluetomcat an hour ago

    What a mess of an article. A pretentious mishmash of scattered references with some vague abstract claims that could be summarised in one paragraph.

    • flohofwoe an hour ago

      Sort of fitting though, because C++ coroutines turned out quite the mess (are they actually usable in real world code by now?).

      I think in the end it's just another story of a C++ veteran living through the inevitable Modern C++ trauma and divorce ;)

      (I wonder what he's up to today, ITHare was quite popular in game dev circles in the 2010s for his multiplayer networking blog posts and books)

      • TuxSH 10 minutes ago

        > C++ coroutines turned out quite the mess (are they actually usable in real world code by now?).

        They are; they're used extensively in software like ScyllaDB, which itself is used by Discord, BlueSky, Comcast, etc.

        C++ coroutines, and "stackless coroutines" in general, are just compiler-generated FSMs. As for allocation, you can override operator new for the promise type, and that operator new gets forwarded the coroutine's function arguments.
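
        A minimal sketch of that allocation hook (C++20), assuming an illustrative Task coroutine type and a hypothetical MyAllocator: the promise-level operator new receives the coroutine's arguments, so a caller-supplied allocator can own the frame.

          // Sketch only: Task and MyAllocator are made-up names, not from the comment above.
          #include <coroutine>
          #include <cstddef>
          #include <cstdlib>

          struct MyAllocator {
              void* allocate(std::size_t n) { return std::malloc(n); }
          };

          struct Task {
              struct promise_type {
                  // The compiler forwards the coroutine's arguments here,
                  // so the frame can be allocated by the caller's allocator.
                  static void* operator new(std::size_t size, MyAllocator& alloc) {
                      return alloc.allocate(size);
                  }
                  // Usual (sized) deallocation function for the frame; a real
                  // implementation would recover the allocator from the frame.
                  static void operator delete(void* ptr, std::size_t) { std::free(ptr); }

                  Task get_return_object() { return {}; }
                  std::suspend_never initial_suspend() noexcept { return {}; }
                  std::suspend_never final_suspend() noexcept { return {}; }
                  void return_void() {}
                  void unhandled_exception() {}
              };
          };

          // 'alloc' is matched against promise_type::operator new's extra parameter.
          Task example(MyAllocator& alloc) {
              co_return;
          }

          int main() {
              MyAllocator alloc;
              example(alloc);  // coroutine frame memory comes from alloc.allocate()
          }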

    • gsliepen 2 hours ago

      Early programming languages had to work with the limited hardware capabilities of the time in order to be efficient. Nowadays, we have so much processing power available that the compiler can optimize the code for you, so the language doesn't have to follow hardware capabilities anymore. So it's only logical that current languages should work with the limitations of the compilers instead. Perhaps one day those limitations will be gone as well for practical purposes, and it would be interesting to see what programming languages could be made then.

      • flohofwoe an hour ago

        > Nowadays, we have so much processing power available that the compiler can optimize the code for you, so the language doesn't have to follow hardware capabilities anymore.

        That must be why builds today take just as long as in the 1990s, to produce a program that makes people wait just as long as in the 1990s, despite the hardware being thousands of times faster ;)

        In reality, people just throw more work at the compiler until build times become "unbearable", and optimize their code only until it feels "fast enough". Those limits of "unbearable" and "fast enough" are built into humans and don't change over a couple of decades.

        Or as the ancient saying goes: "Software is a gas; it expands to fill its container."

        • adrianN an hour ago

          At least we can build software systems that are a few orders of magnitude more complex than in the 90s for approximately the same price. The question is whether the extra complexity also offers extra value.

          • flohofwoe 32 minutes ago

            True, but a lot of that complexity is also just pointless boilerplate / busywork disguised as 'best practices'.

        • j16sdiz an hour ago

          The problem is: "the platform" is never defined.

          When you decouple the language from the hardware and don't specify an abstraction model (like the Java VM does), "the platform" is just whatever the implementer feels like at that moment.

          • lmm an hour ago

            Isn't that the tail wagging the dog? If you build the language to fit current compilers then it will be impossible to ever redesign those compilers.