• OsamaJaber 9 minutes ago

    Small models in the browser are a different optimization problem than small models on a server. On a server you chase throughput, so you batch. In the browser you're stuck at batch size 1, which means kernel launch overhead and memory bandwidth dominate, not FLOPs.
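
    A rough back-of-envelope sketch of why the bandwidth ceiling bites at batch size 1 (TypeScript, with hypothetical numbers that aren't from this project):

        // At batch size 1, each generated token streams the full weight set
        // through the GPU's memory bus, so decode speed is roughly
        // bandwidth-bound rather than FLOP-bound.
        const modelBytes = 0.6e9 * 2;        // ~0.6B params in fp16 (hypothetical size)
        const bandwidthBytesPerSec = 200e9;  // mid-range laptop GPU (hypothetical)
        const tokensPerSecCeiling = bandwidthBytesPerSec / modelBytes;
        console.log(tokensPerSecCeiling.toFixed(0)); // ~167 tokens/s, an upper bound before any kernel overhead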

    • Sathwickp 19 minutes ago

      Just tried it out, must say the speed at which it generates these diagrams is amazing. Is this open source by any chance? Would love to take a look at the code and understand how it works.

      • rahimnathwani 2 hours ago

        How does this part work?

        "The LLM outputs compact code (~50 tokens) instead of raw Excalidraw JSON (~5,000 tokens)."

        I see on the left that the LLM is outputting some instructions to add nodes and edges to the diagram. But what is interpreting those commands and turning them into an Excalidraw file?
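
        My guess is there's a small interpreter in the page that expands each short command into full Excalidraw elements, something like the sketch below (the command names and element fields are purely my guess, not taken from the project):

            // Hypothetical: expand "node a Browser" / "edge a b" into Excalidraw elements.
            function interpret(script: string) {
              const elements: Record<string, unknown>[] = [];
              for (const line of script.trim().split("\n")) {
                const [cmd, ...args] = line.trim().split(/\s+/);
                if (cmd === "node") {
                  // one ~10-token command becomes a rectangle plus its bound text label
                  const [id, ...label] = args;
                  elements.push({ type: "rectangle", id, width: 160, height: 60 });
                  elements.push({ type: "text", text: label.join(" "), containerId: id });
                } else if (cmd === "edge") {
                  // an arrow bound to the two shapes
                  const [from, to] = args;
                  elements.push({ type: "arrow", startBinding: { elementId: from }, endBinding: { elementId: to } });
                }
              }
              return elements; // handed to the canvas, e.g. via excalidrawAPI.updateScene({ elements })
            }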

        • wesleynepo 2 hours ago

          Really interesting. I wish I could understand what's going on under the hood better, but I guess I don't have all the background needed.

          • logicallee 3 hours ago

            I love this idea. Unfortunately, it says "Unsupported browser/GPU" for me. This is desktop Chrome version 147 (the page says it requires 134+) and I have a 1060 card with 6 GB of VRAM in this specific device, so it should fit. I have more than 4 GB of free RAM as well.

            • teamchong 3 hours ago

              Sorry it's not working for you. I built this as a personal project for self-learning, but I plan to take a look at this issue next weekend. You can check out a video demo of it here: https://github.com/user-attachments/assets/71ae6e5c-a5ec-4d0...

              • logicallee 3 hours ago

                That's amazing. Very good result. Thanks for sharing.

            • COOLmanYT 6 hours ago

              No Firefox support?

              • teamchong 4 hours ago

                Firefox has WebGPU already, but the subgroups extension isn't in yet. Every matmul / softmax kernel here leans on subgroupShuffleXor for reductions; that's the blocker. Same reason MLC WebLLM and friends don't run on Firefox either. Once Mozilla ships it, this should work.
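
                For anyone who wants to check their own browser, a minimal detection sketch (assuming the current "subgroups" feature name, which has shifted across browser versions):

                    // Feature-detect WebGPU subgroups; without it, kernels built on
                    // subgroupShuffleXor for reductions can't compile.
                    const adapter = await navigator.gpu?.requestAdapter();
                    if (!adapter?.features.has("subgroups")) {
                      throw new Error("WebGPU subgroups not available");
                    }
                    // Request it explicitly so WGSL shaders can use `enable subgroups;`
                    const device = await adapter.requestDevice({ requiredFeatures: ["subgroups"] });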

              • hhthrowaway1230 5 hours ago

                So multiple of these browser WASM demos make me re-download the models. Can someone make a CDN for it or some sort of uberfast downloader? Just throw some Claude credits at it, ty!

                • wereHamster 5 hours ago

                  CDN wouldn't help much. These days browsers partition caches by origin, so if two different tools (running on different domains) fetch the same model from the CDN, the browser would download it twice.
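
                  Roughly, the HTTP cache key now includes the top-level site as well as the URL, so the same bytes get cached once per tool (a hypothetical illustration, not real browser internals):

                      const keyForToolA = { topLevelSite: "https://tool-a.example", url: "https://cdn.example/model.bin" };
                      const keyForToolB = { topLevelSite: "https://tool-b.example", url: "https://cdn.example/model.bin" };
                      // Different keys -> two full downloads of the identical model file.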

                  • cjbgkagh 2 hours ago

                    Did not know that. That sounds extraordinarily wasteful; there must be a file-hash-based method that would allow sharing such files between domains.

                    • faangguyindia 28 minutes ago

                      It offers security.

                      Just like you wouldn't use the same table in your system for all users in a multi-tenant application.

                      • cjbgkagh 24 minutes ago

                        If the file is hashed strongly enough, then it can be no other file. I can see how information about previously visited sites could be leaked and how that could be bad, but I think whitelisting by end users could still allow some files to be shared across domains, e.g. the code for React.

                      • thornewolf an hour ago

                        It's a security feature. Otherwise my malicious site could check whether cdn.sensitivephotoswebsite.com was already cached and blackmail you.

                        • cjbgkagh an hour ago

                          It would be nice if there were a whitelist option for non-sensitive content. I stopped using CDN links due to the overhead of the extra domain lookups, but I did think that my self-hosted content would be cached across domains.

                    • logicallee an hour ago

                      >Can someone make a CDN for it or some sort of uberfast downloader? Just throw some Claude credits at it, ty!

                      Okay, I did so. I realize from your later follow-up comment that you might want something different (like Chrome itself caching these downloads), but for now I made what you asked for; here you go:

                      https://stateofutopia.com/experiments/ephemeralcdn/

                      It's a temporary CDN for one-off experiments like this and should be lightning fast. By including the script, you can pull in any file this CDN serves.

                      • embedding-shape 5 hours ago

                        Adding a file input where users can load model files into the frontend directly from their file manager would probably work as a stop-gap measure, for those who want something quick that lets people manage their own "cache" of model files.
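
                        A minimal sketch of that stop-gap (the element id and function name are made up; OPFS keeps the copy per-origin, so each site still manages its own cache):

                            // Let the user hand the page a model file they already have, and keep a
                            // copy in the Origin Private File System so later visits skip the download.
                            async function importLocalModel(file: File): Promise<ArrayBuffer> {
                              const root = await navigator.storage.getDirectory();  // OPFS root
                              const handle = await root.getFileHandle(file.name, { create: true });
                              const writable = await handle.createWritable();
                              await writable.write(file);
                              await writable.close();
                              return file.arrayBuffer();  // hand the bytes to the model runtime
                            }

                            document.querySelector<HTMLInputElement>("#model-file")
                              ?.addEventListener("change", async (e) => {
                                const file = (e.target as HTMLInputElement).files?.[0];
                                if (file) await importLocalModel(file);
                              });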

                        • logicallee 4 hours ago

                          Would you be okay with it using your upload bandwidth at the same time? Then a p2p model would work. (This is potentially a good match for p2p because edge connections are very fast; transfers don't have to cross the whole Internet, and you could be downloading from uploaders in your region.) Let me know if you'd be okay with uploading while you download; if so, this model works and I can build it for people to use this way.

                          • Rekindle8090 5 hours ago

                            What? It downloaded for me at 2 Gbps.

                            • hhthrowaway1230 5 hours ago

                              Ah, let me clarify: many of these in-browser demos make me download certain models even if I already have them. It would be great if there were a way to avoid re-downloading them across demos, so that I just have a cache, or an in-browser model manager. Hope this makes sense.

                              Or indeed use some sort of Hugging Face model downloader (if that exists, with XET).

                              • varun_ch 4 hours ago

                                I think this would sit best at the browser level. I’m not sure there’s a nice way for multiple websites to share a cache like that.

                                • hhthrowaway1230 5 hours ago

                                  Also maybe a good use case to finally have P2P web torrents :)

                                • hhthrowaway1230 5 hours ago

                                  Yeah, that's great, but I'm out at a cafe burning through my phone data. ty!

                              • agent37 3 hours ago

                                Very cool. Did you happen to try other models like Qwen, and was there a difference compared to Gemma?