• a_c a day ago

    Thanks for the list. For me, the watershed moment was trying to implement my own minimalistic neural network. Suddenly the concepts of weights, activation functions, and backpropagation made much more sense. Thanks to LLMs, generating that kind of code has never been easier; the advent of LLMs has made learning popular subjects so much easier.
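    The kind of minimalistic network the parent describes can be sketched in a few dozen lines. This is a hedged illustration, not anything from the thread: the one-hidden-layer architecture, sigmoid activations, XOR task, layer sizes, and learning rate are all illustrative choices.

```python
import numpy as np

# A minimal one-hidden-layer network trained on XOR with hand-rolled
# backpropagation. All hyperparameters here are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Weights: 2 inputs -> 4 hidden units -> 1 output
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)          # hidden-layer activations
    return h, sigmoid(h @ W2 + b2)    # network output

initial_loss = np.mean((forward(X)[1] - y) ** 2)

lr = 0.5
for _ in range(5000):
    h, out = forward(X)
    # Backward pass: mean-squared-error gradients, chain rule through sigmoid
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

final_loss = np.mean((forward(X)[1] - y) ** 2)
print(f"MSE: {initial_loss:.3f} -> {final_loss:.3f}")
```

    Writing the forward and backward passes by hand like this is exactly what makes the roles of weights, activations, and backpropagation concrete in a way that reading alone doesn't.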

    • jkestner a day ago

      Thanks to AI I don't need to read all those 300,000 words now, right?

      • FrustratedMonky a day ago

        Curious, is there any update to the original list? I didn't notice in the article if there was any update after 2020. The conclusion links to some other summaries, but doesn't directly call out any sources after 2020.

        Almost everything is still from 2017, which might be correct; I'm asking whether there is a reading list like this from someone like Ilya covering changes since 2017. The AI field has gotten so crowded with papers and articles that it is hard to find a trusted list. Maybe that's why a list from Ilya is so popular.

        • tarolangner a day ago

          Good point. As far as I am aware, even Carmack himself later suggested that the list be updated: https://x.com/ID_AA_Carmack/status/1622673143469858816

          It would be very interesting to see, but I haven't found anything like that so far. There are, of course, hundreds of lists from other people.
