The Birth of Chipzilla (abortretry.fail)
Submitted by rbanffy a day ago
  • Namahanna 18 hours ago

    Another very good history of early Intel is the Asianometry video and associated write-up: https://www.asianometry.com/p/intel-and-amd-the-first-30-yea...

    • aurizon 14 hours ago

      Once accountants started to run this ship, they sailed onto rocky shores. Profits should have been used for research; instead they wasted ~$100 billion on stock buy-backs to keep the funds happy. Those billions, if spent on research, might have kept them off the rocks.

      • whatever1 7 hours ago

        Depends on your scope. From an Intel perspective it would have been wise to keep the cash, and maybe they would be in a better position today. From a market perspective we unlocked $100B and pumped it into other companies, for example Apple, TSMC, Nvidia, etc., or IPO’ed new ones. These seemed to achieve better multipliers on the capital, hence as a whole we are better off (theoretically).

        Now of course all of this is at the macro level. If you look closely, the collapse of Intel would cause severe disruption on both the business and the geopolitical fronts.

      • slowmovintarget 21 hours ago

        This is the story of the birth of Intel, and with it so many of the firsts that laid the foundation for our current technology landscape: the first DRAM chip, the first microprocessor (the 4004), and on through the release of the Intel 8080.

        • em3rgent0rdr 15 hours ago

          Debatable to claim the 4004 as "the first microprocessor". It's safer to specify it as the first "commercially-available general purpose" microprocessor. See https://en.wikipedia.org/wiki/Microprocessor#First_projects for a few pre-4004 chips that are also debatably the first microprocessor:
          - Four-Phase Systems AL1 chip (1969), which was later demonstrated in a courtroom hack to act as a microprocessor (though there is much debate on whether that hack was too hacky)
          - The F-14 CADC's ALU chip (1970), which was classified at the time
          - Texas Instruments TMS 1802NC (announced September 17, 1971, two months before the 4004), which is more specifically termed a microcontroller nowadays, but nevertheless the core was entirely inside a single chip.

          • adrian_b 6 hours ago

            I do not consider the 4004 "general purpose".

            It was designed for implementing a desktop calculator, not a general-purpose computer. With some effort it could be repurposed to implement a simple controller, but it was completely unsuitable for implementing the processor of a general-purpose programmable computer.

            For implementing a general-purpose processor, it is likely that using MSI TTL integrated circuits would have been simpler than using the Intel 4004.

            The Intel 8008 (which implemented the architecture of the Datapoint 2200) was the first commercially available monolithic processor that could be used to make a general-purpose computer, and it was actually used for this.

            Around the same time as the first monolithic processors, Intel invented the ultraviolet-erasable programmable read-only memory (EPROM).

            The EPROM was almost as important as the microprocessor in enabling the appearance of cheap personal computers, because it avoided the need for other kinds of non-volatile memory for storing programs (e.g. punched-tape readers or magnetic core memory), which would have been more expensive than the entire computer.

            • klelatti an hour ago

              Inexpensive personal computers weren’t shipped with EPROMs; they were shipped with mask-programmable ROMs. EPROMs were used in development, but they were nowhere near as important as the microprocessor.

              • rbanffy 3 hours ago

                Didn’t PROM come before EPROM? While I agree EPROM enabled easier testing, PROMs would fit the bill once their contents got stable.

                • adrian_b 2 hours ago

                  Bipolar PROM was too small to contain the equivalent of the BIOS of a computer in the early seventies.

                  Bipolar PROMs were initially used mainly to store microprograms for CPUs with microprogrammed control and for implementing various kinds of programmable logic, in which role they were later replaced by PLAs, then by PALs.
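
                  A minimal sketch of that "PROM as programmable logic" role (not from the article): the input signals drive the address lines and each stored word is a precomputed output. Here a hypothetical 32 x 8-bit part, the 32-byte size mentioned below, is "burned" with a 2-bit adder truth table in Python.

                      # Hypothetical 32 x 8-bit PROM used as combinational logic:
                      # address bits = a (2 bits) | b (2 bits) | carry-in (1 bit).
                      def program_adder_prom():
                          prom = bytearray(32)          # the 32-byte PROM image
                          for a in range(4):
                              for b in range(4):
                                  for cin in range(2):
                                      address = (a << 3) | (b << 1) | cin
                                      prom[address] = a + b + cin   # 3-bit sum fits in the byte
                          return prom

                      def read_prom(prom, a, b, cin):
                          # "Wire" the inputs to the address pins and read the output word.
                          return prom[(a << 3) | (b << 1) | cin]

                      prom = program_adder_prom()
                      print(read_prom(prom, 3, 2, 1))   # prints 6 (3 + 2 + 1)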

                  I do not think that there has ever been any kind of computer that stored in bipolar PROMs programs that were usable during normal operation, except perhaps some embedded controllers designed before microprocessors, together with their associated EPROMs, became widespread.

                  By the time the Intel EPROMs like the 1702 and 2708 appeared (with capacities of 256 bytes and then 1 kbyte), typical bipolar PROMs had capacities of either 32 bytes or 128 bytes.

                  In that space you could put at most some kind of initial loader that would load a real bootstrapping program from something like a punched-tape reader. This kind of solution was used in some minicomputers, replacing the entry of such an initial loader via the console keys by the operator. Such minicomputers were still at least an order of magnitude more expensive than the first computers with microprocessors, mainly due to the expensive peripherals required for a working system.
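
                  To make the "initial loader" idea concrete, here is a rough Python sketch (hypothetical machine, invented tape format and load address, not any specific minicomputer) of what such a tiny PROM routine does: read a second-stage program from a paper-tape reader into RAM and hand control to it.

                      RAM_SIZE = 4096
                      LOAD_ADDRESS = 0x0100                      # made-up RAM location for the second stage

                      def prom_first_stage(read_tape_byte, ram):
                          # Mimics the tiny PROM loader: the first tape byte is the length,
                          # then that many program bytes are copied into RAM.
                          length = read_tape_byte()
                          for offset in range(length):
                              ram[LOAD_ADDRESS + offset] = read_tape_byte()
                          return LOAD_ADDRESS                    # real hardware would jump here

                      tape = iter([3, 0xA9, 0x01, 0x60])         # simulated punched tape: length + 3 bytes
                      ram = bytearray(RAM_SIZE)
                      entry = prom_first_stage(lambda: next(tape), ram)
                      print(hex(entry), ram[LOAD_ADDRESS:LOAD_ADDRESS + 3].hex())   # 0x100 a90160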

            • brcmthrowaway 18 hours ago

              How did Intel lose DRAM to Micron?!

              • adrian_b 6 hours ago

                By the time the Japanese DRAM manufacturers began to make DRAM chips that were both better and cheaper than those made by Intel, Intel decided, in 1985, probably rightly, that instead of investing a lot of money in trying to catch up with the Japanese they should cut their losses by exiting the DRAM market and concentrate on what they were doing better, i.e. CPUs for IBM-compatible PCs.

                This decision was taken not long after the launch of the IBM PC/AT, and while Intel was preparing the launch of the 80386, so they were pretty certain that they could make a lot of money from CPUs even if they abandoned their traditional market.

                It is likely that Intel reached the point where such a decision had to be taken because for many years they had underestimated the competence of the Japanese, believing that they were not innovating but only copying what the Americans did, exactly as many Americans now claim about the Chinese. By the time they realized that the Japanese DRAMs were actually higher quality and that the Japanese plants had much better fabrication yields, it was too late.

                • Panzer04 18 hours ago

                  Deliberate decision to focus on higher-margin products that aren't commodities (like memory). I believe similar logic was used to justify the sale of their flash business.

                  • xadhominemx 17 hours ago

                    Micron itself was often touch and go until several competitors went bankrupt around 2010.

                  • glitchc 3 hours ago

                    DRAM has very tight profit margins. It's a very cost-focused product line to be in; a company like Intel would never be able to get costs low enough. It was the right call.

                    • adrian_b 2 hours ago

                      During the early eighties, DRAM became a product with very tight profit margins, thanks to the Japanese competitors.

                      Before that, Intel and Mostek could charge an arm and a leg for their DRAMs, obtaining handsome profits.

                    • bee_rider 15 hours ago

                      I don’t really get the Intel/Micron relationship. Much later, Intel collaborated with Micron on their NVMe tech (3D XPoint/Optane), but in the end they gave up the product line to Micron, right?

                      Companies don’t have friends. But they seem quite cozy?

                  • wslh 15 hours ago

                    > The 3101 held 64 bits of data (eight letters of sixteen digits)

                    The 3101 held 64 bits of data (eight bytes, each representing values from 0 to 255).
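
                    As a quick sanity check on that arithmetic (a trivial Python sketch): 64 bits is 8 bytes and one byte spans 0 to 255; the 3101 itself was organized as 16 words of 4 bits, which is presumably where the quoted "sixteen" comes from.

                        TOTAL_BITS = 64
                        BITS_PER_BYTE = 8
                        print(TOTAL_BITS // BITS_PER_BYTE)   # 8 bytes
                        print(2 ** BITS_PER_BYTE - 1)        # 255, the largest value one byte holds
                        print(TOTAL_BITS // 4)               # 16 four-bit words, the chip's actual organization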

                    • aurizon 14 hours ago

                      I recall when Mike Magee of the UK's The Inquirer coined the term 'Chimpzilla' (AMD) as Intel's ('Chipzilla') perpetual rival.