Llamafile Returns (blog.mozilla.ai)
Submitted by aittalam 3 days ago
  • jart 2 days ago

    Really exciting to see Mozilla AI starting up and I can't wait to see where the next generation takes the project!

    • bsenftner 21 hours ago

      People are so uninformed, they don't know you are Mozilla AI's star employee.

      • setheron 15 hours ago

        I agree that the work they put out is A+. Whenever I see content produced by jart, it's always amazing.

        • dingnuts 2 hours ago

          sorry, I'm out of the loop. is this thread glazing a celebrity member commenting on an announcement from his own team to create hype?

          what the fuck is wrong with this website

        • rvz 21 hours ago

          s/are/was

          I don't know if you were informed, but you realize that jart is no longer at Mozilla and is now at Google, right?

          • setheron 16 hours ago

            jart, you are back at Google?

            • jart 7 hours ago

              Yeah Google liked llamafile so much that they asked me to help them improve the LLM on their website too.

      • thangalin 20 hours ago

        Tips:

            # Avoid issues when wine is installed.
            sudo su -c 'echo 0 > /proc/sys/fs/binfmt_misc/status'
        
        And:

            # Capture the entirety of the instructions to obtain the input length.
            # (`join` is a user-defined helper, not shown here.)
            readonly INSTRUCT=$(
              join "${PATH_PREFIX_SYSTEM}" "${PATH_PROMPT_SYSTEM}" "${PATH_PREFIX_SYSTEM}"
              join "${PATH_SUFFIX_USER}" "${PATH_PROMPT_USER}" "${PATH_SUFFIX_USER}"
              join "${PATH_SUFFIX_ASSIST}" "/dev/null" "${PATH_SUFFIX_ASSIST}"
            )

            # Quote the expansion so embedded newlines survive the pipe.
            # Note: ${#INSTRUCT} is a character count, not a token count.
            echo "${INSTRUCT}" | ./llamafile \
              -m "${LINK_MODEL}" \
              -e \
              -f /dev/stdin \
              -n 1000 \
              -c "${#INSTRUCT}" \
              --repeat-penalty 1.0 \
              --temp 1.5 \
              --silent-prompt > output.txt
        • chrismorgan 15 hours ago

          > # Avoid issues when wine is installed.

          > sudo su -c 'echo 0 > /proc/sys/fs/binfmt_misc/status'

          Please don’t recommend this. If binfmt_misc is enabled, it’s probably for a reason, and disabling it will break things. I have a .NET/Mono app installed that it would break, for example—it’s definitely not just Wine.

          If binfmt_misc is causing problems, the proper solution is to register the executable type. https://github.com/mozilla-ai/llamafile#linux describes the steps.

          I made myself a package containing /usr/bin/ape and the following /usr/lib/binfmt.d/ape.conf:

            :APE:M::MZqFpD::/usr/bin/ape:
            :APE-jart:M::jartsr::/usr/bin/ape:
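
          With that file in place, a sketch of activating it without a reboot, assuming a systemd-based distribution:

            # Re-read /usr/lib/binfmt.d/*.conf and register the APE formats.
            sudo systemctl restart systemd-binfmt.service

            # Verify the registration took.
            cat /proc/sys/fs/binfmt_misc/APE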
        • swyx 21 hours ago

          justine tunney gave a great intro to Llamafile at AIE last year if it helps anyone: https://www.youtube.com/watch?v=-mRi-B3t6fA

          • michaelgiba 19 hours ago

            I’m glad to see llamafile being resurrected. A few things I hope for:

            1. Curate a continuously extended inventory of prebuilt llamafiles for models as they are released.
            2. Create both flexible builds (with dynamic backend loading for CPU and CUDA) and slim minimalist builds.
            3. Upstream as much as they can into llama.cpp and partner with the project.

            • michaelgiba 19 hours ago

              Crazier ideas would be:

              - extend the concept to also have some sort of “agent mode” where the llamafiles can launch with their own minimal file system or isolated context
              - detailed profiling of main supported models to ensure deterministic outputs

              • njbrake 19 hours ago

                Love the idea!

            • dolmen 11 hours ago

              Cosmocc and Cosmopolitan are remarkable technical achievements and llamafile made me discover them.

              The llamafile UX (CLI interface and web server with chat to quickly interact with the model) is great and makes it easy to download and play with a local LLM.

              However, I fail to see use cases where I would build a solution on a llamafile. If I want to play with multiple models, I don't need to have the binary attached to the model data. If I want to play with a model on multiple operating systems, I'm fine downloading the llamafile tool binary for the platform separately from the model data (in fact, on Windows one has to download llamafile.exe separately anyway because of the OS's limit on executable file size).
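
              As a minimal sketch of that separate-binary workflow (the model file name here is hypothetical):

                # One runner binary per platform; the GGUF weights travel separately.
                ./llamafile -m Meta-Llama-3-8B-Instruct.Q4_K_M.gguf -p 'Why is the sky blue?' -n 64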

              So Cosmopolitan is great tech, the llamafile command (the "UX for a model" part) is great, but I'm not convinced by the value of Cosmopolitan applied here.

              • synergy20 20 hours ago

                how is this different from ollama? for me the more/open the merrier.

                • ricardobeat 20 hours ago

                  Ollama is a model manager and a pretty interface for llama.cpp; llamafile is a cross-platform packaging tool to distribute and run individual models, also based on llama.cpp.
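
                  Roughly, the day-to-day difference looks like this (model names are illustrative):

                    # Ollama: a resident service manages model downloads and serving.
                    ollama run llama3

                    # llamafile: one self-contained file is both the engine and the weights.
                    chmod +x Llama-3.2-1B-Instruct.llamafile
                    ./Llama-3.2-1B-Instruct.llamafile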

                • FragenAntworten 17 hours ago

                  The Discord link is broken, in that it links to the server directly rather than to an invitation to join the server, which prevents new members from joining.

                  • njbrake 8 hours ago

                    Fixed, thank you!

                  • benatkin 16 hours ago

                    > As the local and open LLM ecosystem has evolved over the years, time has come for llamafile to evolve too. It needs refactoring and upgrades to incorporate newer features available in llama.cpp and develop a refined understanding of the most valuable features for its users.

                    It seems people have moved on from Llamafile. I doubt Mozilla AI is going to bring it back.

                    This announcement didn't even come with a new code commit, just a wish. https://github.com/mozilla-ai/llamafile/commits/main/

                    • apitman 21 hours ago

                      This is great news. Given the proliferation of solid local models, it would be cool if llamafile had a way to build your own custom versions with the model of your choice.
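
                      If I remember the llamafile README right, something close to this already exists via its zipalign tool; a rough sketch with hypothetical file names:

                        # Start from the bare runner, then embed the weights plus default flags.
                        cp llamafile mymodel.llamafile
                        printf '%s\n' -m mymodel.Q4_K_M.gguf > .args
                        zipalign -j0 mymodel.llamafile mymodel.Q4_K_M.gguf .args
                        ./mymodel.llamafile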

                      • behindsight a day ago

                        great stuff. I'm working on something around agentic tooling and hope to collab with Mozilla AI in the future, as they share the same values I have.

                        • throawayonthe 21 hours ago

                          go get that investor money i guess?