• threepi 14 hours ago

    Author here. Happy to see this posted here. This is actually a series of blog posts:

    1. Exploring LoRA — Part 1: The Idea Behind Parameter Efficient Fine-Tuning and LoRA: https://medium.com/inspiredbrilliance/exploring-lora-part-1-...

    2. Exploring LoRA - Part 2: Analyzing LoRA through its Implementation on an MLP (a rough sketch of the core idea is included after this list): https://medium.com/inspiredbrilliance/exploring-lora-part-2-...

    3. Intrinsic Dimension Part 1: How Learning in Large Models Is Driven by a Few Parameters and Its Impact on Fine-Tuning: https://medium.com/inspiredbrilliance/intrinsic-dimension-pa...

    4. Intrinsic Dimension Part 2: Measuring the True Complexity of a Model via Random Subspace Training https://medium.com/inspiredbrilliance/intrinsic-dimension-pa...
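    Not taken from the posts themselves, just a minimal sketch of the core LoRA idea on a single linear layer, assuming PyTorch; the layer size, rank and scaling below are illustrative, not the values used in Part 2. The pretrained weight is frozen and only a low-rank update B·A is trained:

        import torch
        import torch.nn as nn

        class LoRALinear(nn.Module):
            """Frozen pretrained linear layer plus a trainable low-rank update: W x + (alpha/r) * B A x."""
            def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
                super().__init__()
                self.base = base
                self.base.weight.requires_grad_(False)      # freeze the pretrained weight
                if self.base.bias is not None:
                    self.base.bias.requires_grad_(False)
                self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)  # down-projection
                self.B = nn.Parameter(torch.zeros(base.out_features, r))        # up-projection, zero-init so training starts at the pretrained model
                self.scale = alpha / r

            def forward(self, x):
                # frozen path plus the low-rank correction; only A and B receive gradients
                return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

        layer = LoRALinear(nn.Linear(512, 512), r=8)
        trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
        print(trainable)  # 8192 trainable parameters vs. 262,656 in the full layer

    Merging afterwards is just adding scale * B @ A back into the frozen weight, which is why the adapter adds no inference overhead once fine-tuning is done.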

    Hope you enjoy reading the other posts too. Merry Christmas and Happy Holidays!

    • 3abiton 7 hours ago

      Thanks for sharing. This got me thinking: why is Medium used so much for technical articles like these? Especially since lots of articles have been ending up behind a paywall for me recently.

    • jwildeboer 10 hours ago

      (Not to be confused with LoRa (short for long range), a spread-spectrum modulation technique derived from chirp spread spectrum (CSS) that powers technologies like LoRaWAN and Meshtastic.)
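
      For the curious, a tiny numpy sketch of what a chirp spread spectrum symbol looks like: the frequency sweeps linearly across the whole bandwidth over one symbol. Purely illustrative numbers, not an actual LoRa PHY.

          import numpy as np

          # Illustrative CSS up-chirp: frequency sweeps linearly from -bw/2 to +bw/2
          # over one symbol duration. Example parameters only, not a real modulator.
          bw = 125e3              # bandwidth (Hz)
          sf = 7                  # spreading factor -> 2**sf chips per symbol
          fs = 1e6                # sample rate (Hz)
          t_sym = (2 ** sf) / bw              # symbol duration (s)
          t = np.arange(0, t_sym, 1 / fs)
          k = bw / t_sym                      # chirp rate (Hz per second)
          phase = 2 * np.pi * (-0.5 * bw * t + 0.5 * k * t ** 2)
          upchirp = np.exp(1j * phase)        # complex baseband up-chirp
          print(f"{len(t)} samples per {t_sym * 1e3:.3f} ms symbol")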

      • SeasonalEnnui 5 hours ago

        This gets me every time. I expect to see something interesting and it turns out to be the other one. One is a fantastic thing and the other is mediocre; pick which way round at your discretion!

        • pavlov an hour ago

          What exactly is the confusion? Does “parameter efficient fine-tuning” mean anything in the context of the other LoRa? If not, then it’s probably obvious which one this is about.

          • mrgaro 21 minutes ago

            Actually it does: LoRa, the radio protocol, has parameters to tune. Usually both sender and receiver need to match them, so I read the title as a method for automatically tuning those parameters based on distance and the radio environment.
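
            For concreteness, the parameters in question are things like spreading factor, bandwidth and coding rate, and they trade range against airtime. A rough sketch of that trade-off using the time-on-air formula from the Semtech SX127x datasheet (written from memory, so double-check against the datasheet before relying on it):

              from math import ceil

              def lora_airtime_ms(payload_bytes, sf, bw_hz=125_000, cr=1,
                                  preamble_syms=8, explicit_header=True, crc=True):
                  # Approximate LoRa time-on-air (Semtech SX127x datasheet formula).
                  # sf: spreading factor 7..12, bw_hz: bandwidth, cr: 1..4 (coding rate 4/5..4/8).
                  # Sender and receiver must be configured with the same sf / bw_hz / cr.
                  t_sym = (2 ** sf) / bw_hz                      # symbol duration (s)
                  de = 1 if t_sym > 0.016 else 0                 # low data rate optimisation
                  ih = 0 if explicit_header else 1
                  num = 8 * payload_bytes - 4 * sf + 28 + 16 * int(crc) - 20 * ih
                  payload_syms = 8 + max(ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
                  return (preamble_syms + 4.25 + payload_syms) * t_sym * 1e3

              # Higher spreading factor reaches further but costs far more airtime for the
              # same 20-byte payload -- the trade-off an auto-tuner would have to navigate.
              for sf in (7, 9, 12):
                  print(sf, round(lora_airtime_ms(20, sf), 1), "ms")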

        • sva_ 2 hours ago

          Pretty simple to spot LoRa vs LoRA.

      • FusspawnUK 9 hours ago

        Really wish they had come up with another name. Googling gets annoying.

        • the__alchemist 5 hours ago

          Contributing factors: they both use mixed capitalization, and they have partially overlapping audiences.

    • danielhanchen 9 hours ago

      Super cool series of articles! :)

    • gautambt 14 hours ago