• photonthug 6 hours ago

    > Though APL may strike some as a strange language of choice for deep learning, it offers benefits that are especially suitable for this field: First, the only first-class data type in APL is the multi-dimensional array, which is one of the central objects of deep learning in the form of tensors. This also signifies that APL is by nature data parallel and therefore particularly amenable to parallelization. Notably, the Co-dfns project compiles APL code for CPUs and GPUs, exploiting the data parallel essence of APL to achieve high performance. Second, APL also almost entirely dispenses with the software-specific "noise" that bloats code in other languages, so APL code can be directly mapped to algorithms or mathematical expressions on a blackboard and vice versa, which cannot be said of the majority of programming languages. Finally, APL is extremely terse; its density might be considered a defect by some that renders APL a cryptic write-once, read-never language, but it allows for incredibly concise implementations of most algorithms. Assuming a decent grasp of APL syntax, shorter programs mean less code to maintain, debug, and understand.

    This is really cool. At about 150 lines, terse indeed. And it makes sense that of course APL could work well with gpus, but I'm kind of surprised there's enough of it still out in the wild that there's already a reliable toolchain for doing this.
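    To make the "maps directly to the blackboard" claim concrete: the arithmetic mean, sum(x)/n, is the classic APL train +/÷≢ ("sum divided by tally"). A NumPy analogue of the same expression (my own sketch for illustration, not code from the article):

```python
import numpy as np

# Blackboard definition of the mean: (1/n) * sum(x_i).
# APL writes this as the train  +/ ÷ ≢  ("sum divided by tally");
# the NumPy expression below mirrors it almost symbol for symbol.
x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])
mean = x.sum() / x.size  # 14.0 / 5 = 2.8
```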

    • nextos 5 hours ago

      > APL could work well with gpus

      I've seen at least one APL implementation running on top of Julia, thanks to macros.

      Julia has good GPU support, and it makes it easy to compose that support with any library.

      However, kdb+ and q, which are APL descendants, have good GPU support already: https://code.kx.com/q/interfaces/gpus. But licenses are not cheap...

      • koolala 3 hours ago

        GPUs can even run APL as a higher-level programming language. It's the only abstract language I've ever heard of running on a GPU. Arrays in, arrays out. A GPU is array-programming hardware.

        I hope one day it's as normal as the 1000s of CPU languages. It would be nice to have more than 10 GPU languages.
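        The "arrays in, arrays out" model is exactly why it fits: a whole-array expression says that every element can be computed independently, which is the execution model a GPU provides. A small NumPy sketch of the idea (my example; NumPy runs this on the CPU, and GPU array libraries such as CuPy accept the same expression):

```python
import numpy as np

# One whole-array expression instead of an explicit loop.
# Each output element depends only on the matching input element,
# so the computation is trivially data parallel.
a = np.arange(6, dtype=np.float64)  # [0, 1, 2, 3, 4, 5]
b = a * 2.0 + 1.0                   # arrays in, arrays out -- no loop
```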

      • Dr_Birdbrain 6 hours ago

        > APL code can be directly mapped to algorithms or mathematical expressions on a blackboard and vice versa

        After looking at the code, I find this claim questionable.

        • Avshalom 4 hours ago

          APL was invented by Iverson as a blackboard notation because he felt the existing notation was awkward/insufficient for describing computation/algorithms.

          • koolala 3 hours ago

            Linear algebra notation is the real notation (on a blackboard), aka the language of AI. The medium is just the message format.

          • jodrellblank 6 hours ago

            After looking at HN comments for years, I find this low effort dismissal downvoteable.

            APL was originally a rewrite and normalisation of traditional math notation for use on blackboards. Before it was anything to do with computers it was linear algebra without all the bizarre precedence rules and with some common useful operations.
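            Concretely, APL drops the precedence table entirely: every function takes everything to its right, so 3×4+5 means 3×(4+5), not (3×4)+5. A tiny Python sketch of the contrast (my example):

```python
# Conventional ("blackboard") precedence: * binds tighter than +.
conventional = 3 * 4 + 5  # (3 * 4) + 5 = 17

# APL has no operator precedence: functions evaluate right to left,
# so 3×4+5 parses as 3×(4+5).
apl_style = 3 * (4 + 5)   # 27
```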

            • keithalewis 5 hours ago

              Name checks out.

          • sakras an hour ago

            > Though APL may strike some as a strange language of choice for deep learning

            I've actually spent the better part of the last year wondering why we _haven't_ been using APL for deep learning. And I've been wondering why we don't just use APL for everything that operates over arrays, like data lakes and such.

            Honestly, APL is probably a good fit for compilers. I seem to remember a guy who had some tree-wrangling APL scheme, and could execute his compiler on a GPU. But I can't find it now.

          • gcanyon 4 hours ago

            > Though APL may strike some as a strange language of choice for deep learning

            It sure did to me, even as someone who has written (a trivial amount of) J. But the argument that follows is more than convincing.