• fuzzfactor 2 hours ago

    I don't think anybody can ever learn enough about concurrency, because it's happening all the time whether you want it to or not.

    • ygritte a day ago

      [flagged]

      • maxmcd a day ago

        How would a distinction between concurrency and parallelism benefit the modeling of program logic?

        The programming language provides threading and locking mechanisms.

        • Kranar a day ago

          Parallelism introduces additional hazards such as data races, which are not present in concurrent code that lacks parallelism.

          • seanw444 a day ago

            I'm of the opinion that we chose the wrong term for concurrency. Concurrency means multiple things going on simultaneously, which is not what's happening with our version of it. Only one thing is happening at any given time, but the tasks are being multiplexed on a shared resource. Parallelism = concurrency, "concurrency" = multiplexing.

            Maybe there's a more appropriate term than multiplexing, but I think that's certainly better than concurrency at describing it.

            • Kranar a day ago

              Yes, that's true, the terms are confusing, but the distinction is nevertheless important. For example, having read through some of this book, it's still not clear whether it involves parallelism, especially since it compares itself to Python, which does not offer parallelism without running C extensions.

            • Jtsummers a day ago

              Data races and other race conditions are still present in concurrent systems without parallelism (in the sense of actually executing at the same time, e.g., on multiple cores). If they weren't, we wouldn't need most uses of mutexes and semaphores on single-core processors. As the book gets into, concurrency is about multiple tasks that are arbitrarily interleaved with each other. That interleaving is why you can have data races and other errors even on a single-core system.
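
              A minimal Go sketch of that lost-update interleaving (my own illustration, not from the book): four goroutines share one OS thread, so nothing ever runs in parallel, yet increments still disappear. The runtime.Gosched() call just stands in for an arbitrary preemption point.

                  package main

                  import (
                      "fmt"
                      "runtime"
                      "sync"
                  )

                  func main() {
                      // One OS thread: goroutines are interleaved, never truly parallel.
                      runtime.GOMAXPROCS(1)

                      var counter int
                      var wg sync.WaitGroup

                      for i := 0; i < 4; i++ {
                          wg.Add(1)
                          go func() {
                              defer wg.Done()
                              for j := 0; j < 1000; j++ {
                                  v := counter      // read shared state
                                  runtime.Gosched() // stand-in for an arbitrary preemption point
                                  counter = v + 1   // write back a possibly stale value: lost update
                              }
                          }()
                      }

                      wg.Wait()
                      // Expected 4000, but interleaving between the read and the write
                      // loses most increments; go run -race flags the unsynchronized access too.
                      fmt.Println("counter =", counter)
                  }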

              • Kranar a day ago

                Data races are not possible on a single core system.

                • undefined a day ago
                  [deleted]
                  • convolvatron a day ago

                    They are entirely possible assuming preemptive scheduling.

                    • Kranar a day ago

                      Yes that's true and I was wrong to say otherwise. A data race can happen with preemptive multithreading on data whose size exceeds what the platform guarantees to access atomically, typically the word size.

                      A more accurate statement would be that parallelism introduces additional possibilities for data races beyond those already possible with concurrent execution alone (without parallelism).
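
                      To make that first case concrete, here's a rough Go sketch (my illustration, not from the thread or the book; a two-word struct stands in for any value wider than what the platform accesses atomically). The writer is only interleaved with the reader, never parallel to it, yet the reader can still observe a half-updated value.

                          package main

                          import (
                              "fmt"
                              "runtime"
                          )

                          // pair is wider than a single machine word, so the hardware gives
                          // no guarantee that both halves are updated together.
                          type pair struct{ lo, hi uint64 }

                          var shared pair

                          func main() {
                              runtime.GOMAXPROCS(1) // single OS thread: interleaving only, no parallelism

                              done := make(chan struct{})
                              go func() {
                                  for i := uint64(1); i <= 1000; i++ {
                                      shared.lo = i
                                      runtime.Gosched() // stand-in for a preemption between the two word writes
                                      shared.hi = i
                                  }
                                  close(done)
                              }()

                              torn := 0
                              for running := true; running; {
                                  select {
                                  case <-done:
                                      running = false
                                  default:
                                      if s := shared; s.lo != s.hi {
                                          torn++ // caught a half-written ("torn") value
                                      }
                                      runtime.Gosched()
                                  }
                              }
                              fmt.Println("torn reads observed:", torn) // typically well above zero, even on one core
                          }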

            • convolvatron a day ago

              Robert van Renesse is a hugely respected distributed systems researcher with decades of influential publications, and a valued mentor in the community.

              I'm sorry that you feel his contributions are meaningless because he hasn't caught up with the way that devops people talk about these things.

              • undefined 19 hours ago
                [deleted]
                • ygritte 11 hours ago

                  Then I'm even more disappointed. Appeal to authority does not make a good publication. He should know better or at least clearly define the terms he uses. Words have meaning. If you use the same words, but with a different meaning, it will produce confusion. Confusion may be a feature in marketing material, but if this publication is intended to teach something, then the confusion is a bug.

                • jackmottatx a day ago

                  This is the bike shed of parallelism discussions: people with nothing useful to add just parrot it. It is the worst.