• btrettel a day ago

    The headline is inaccurate. As far as I can tell, no patents have been granted yet. Intel filed patent applications. Failure to distinguish between applications and granted patents is far too common.

    https://patents.google.com/patent/US20250217157A1/en

    See the sidebar on the right? Look at "Application US18/401,460 events". Note that the status is "Pending" and not "Active" or "Expired". Google isn't always accurate here as their data could be out of date, but they're accurate enough for me to not look further. You can check the other countries as well to see all are pending.

    • rs186 a day ago

      Just like articles that say "xx research group publishes new article discovering ..." when it is a preprint on arXiv (especially in "traditional" physical sciences). I mean, they kind of published it, but I would be very careful about reporting work that has not gone through peer review yet.

      • dkiebd a day ago

        Okay? This is not relevant. What matters here is that they have developed this technology. “Patents” in the title of the article is a way of saying that they have developed the technology.

        • btrettel a day ago

          I don't agree. I think most people would think that "patents" implies that a patent office has granted a patent.

          For what it's worth, confusing patents and patent applications is a pet peeve of mine as a former patent examiner. I've seen people criticize the USPTO for apparently granting a patent on some nonsense, but when I look at it, the USPTO rejected the application. The problem is that people can't tell the difference between a patent application and patent. I saw an opportunity to clarify this issue and I took it.

          • Doxin 15 hours ago

            > What matters here is that they have developed this technology.

            Having a pending patent, or even a granted patent, does not mean the technology described has been invented. There are many many patents on all sorts of infinite energy devices for example. It should go without saying that none of those work.

        • ch_123 a day ago

          This feels like Intel's researchers explored an idea, and decided to patent it as a matter of routine. The limits of ILP in typical applications are well documented, and I can't imagine that issuing dozens of instructions at once is likely to be useful outside of some very specific benchmarks.

          Perhaps one use is to compete with GPUs, but even a multi core CPU is not likely to compete with a GPU in terms of number of arithmetic/vector units.

          • IsTom a day ago

            > On the software side, the system uses either a JIT compiler, static compiler, or binary instrumentation to split a single-threaded program into code segments to assign different blocks to different cores. It injects special instructions for flow control, register passing, and sync behavior, enabling the hardware to maintain execution integrity.

            Itanium is back again?
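The quoted passage describes splitting one thread's code into segments across cores, with injected instructions for register passing and sync. As a hedged sketch of that idea only (names and structure are illustrative, not from the patent): two worker threads stand in for cores, and a queue stands in for the inter-core register-forwarding hardware.

```python
import queue
import threading

forward = queue.Queue()  # stands in for inter-core register forwarding

def segment_a(x):
    r1 = x * 3           # first half of the original single-threaded code
    forward.put(r1)      # injected "register passing" instruction

def segment_b(results):
    r1 = forward.get()   # injected sync: block until the live value arrives
    results.append(r1 + 7)

results = []
t1 = threading.Thread(target=segment_a, args=(5,))
t2 = threading.Thread(target=segment_b, args=(results,))
t2.start(); t1.start()
t1.join(); t2.join()
print(results)  # [22]
```

The blocking `get()` is doing the work the patent assigns to dedicated flow-control instructions; in hardware the forwarding latency would have to be far below what a software queue costs for this to pay off.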

            • hakfoo a day ago

              I always thought there was value in a simpler take on this. Especially in the commercial-software world, not everything is compiled for the exact foibles of your current CPU.

              Why aren't we running a JIT from x86 to an "optimized subset of x86"? How much performance could it buy us?

              • BuildTheRobots a day ago

                They kinda do - the decoder translates x86(_64) instructions into micro-ops for actual execution. If you go back far enough, to the P6, the actual execution core was essentially RISC [1]

                [1] https://news.ycombinator.com/item?id=36380149

              • euLh7SM5HDFY a day ago

                As badly as it worked out, I don't think Itanium tried to break Amdahl's law. And that is how I understand this magic multi-core execution of single-threaded code.
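For reference, Amdahl's law caps the speedup from parallelizing a fraction p of a workload across n cores; the thread invokes the law but not the formula, so this is the standard form, not anything from the article.

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Upper bound on speedup when a fraction p of the work is
    perfectly parallel across n cores and the rest stays serial."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% parallel work, 16 cores give well under 7x:
print(round(amdahl_speedup(0.9, 16), 2))  # 6.4
```

This is why multi-core execution of nominally single-threaded code is a hard sell: the serial fraction of typical programs dominates quickly.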

                • jerf a day ago

                  I'm surprised they even pursued this line of research, though they may be considering it just as a basic territory claim that they don't have a high expectation of turning into anything. Research into "implicit parallelism" has been done a lot over the years and the consistent result has been that there is a lot less than people intuitively think, and I mean, a lot less. I wouldn't hold out much hope for this... but then again, in a world of nearly frozen clock speeds, it wouldn't take much to stand out.
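The scarcity of implicit parallelism comes down to data dependences. A minimal illustration (my example, not from the article): a loop-carried dependence serializes execution no matter how many cores exist, while independent elementwise work parallelizes trivially.

```python
def prefix_sum_serial(xs):
    """Each iteration reads the previous accumulator value, so the
    chain of additions cannot be split across cores as-is."""
    acc, out = 0, []
    for x in xs:
        acc += x          # loop-carried dependence on acc
        out.append(acc)
    return out

def elementwise_double(xs):
    # Every element is independent: trivially parallelizable work,
    # which is the rare case implicit-parallelism research hunts for.
    return [2 * x for x in xs]

print(prefix_sum_serial([1, 2, 3, 4]))  # [1, 3, 6, 10]
```

(Prefix sums do have parallel algorithms, but they require restructuring the computation; a compiler or binary instrumenter extracting parallelism "for free" from the serial loop is exactly the hard part.)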

                  • IsTom a day ago

                    With Itanium they assumed that "smart compilers" would locally parallelize programs.

                  • hulitu a day ago

                    > Itanium is back again?

                    Itanium was also the trojan horse against competing architectures. From that POV, it succeeded.

                  • devl547 a day ago

                    Softmachines VISC architecture is not dead?

                    • Havoc a day ago

                      > improving single-thread performance, provided that it has enough parallel work

                      So cases where the programmer didn’t optimise?

                      • arccy 2 days ago

                        how long until we get something like spectre for this...

                        • snickerbockers a day ago

                          I doubt it; based on TFA it looks like it has more in common with multi-issue pipelining and out-of-order execution than with speculative execution.

                          • mhh__ a day ago

                            Isn't the whole point of out-of-order execution that the design is inherently speculative? Otherwise there's basically nothing to dispatch.

                            • neuroelectron a day ago

                              There's a lot of processor state in each core, which would be a great place to hide exploits when the microcode is assuming synced operation between cores.

                              • jokoon a day ago

                                by TFA you mean the fucking article?

                                Why swear? Not that I have a problem with it.

                                • Cthulhu_ a day ago

                                  I don't get it either; it seems to be an HN culture thing. I get it when people reply to someone who hasn't read the article (like in RTFM), but it's often used unsolicited and, IMO, unnecessarily.

                                  • rep_lodsb a day ago

                                    "the fine article"

                              • brnt a day ago

                                (Dynamic?) Software Bulldozer? What could possibly go wrong?

                                • m3kw9 a day ago

                                  Aka FPGA