Is it possible to point this at a custom OAI-compatible LLM endpooint of my choice?
Not right now, but I do plan to add that as a feature in the future. I'm going to let people use their own OpenAI key or a custom OAI-compatible endpoint with the extension.
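For anyone curious, "OAI-compatible" just means a server that accepts the same request shape as OpenAI's `/v1/chat/completions` API, so swapping endpoints is mostly a matter of changing the base URL and key. A minimal sketch of what such a request looks like (the base URL, key, and model name below are placeholders, not the extension's actual config):

```python
import json
import urllib.request

# Hypothetical placeholders: any OAI-compatible server and model name would do.
BASE_URL = "http://localhost:8000/v1"
API_KEY = "sk-local-placeholder"

def build_chat_request(messages, model="local-model"):
    """Build an (unsent) request for an OpenAI-compatible chat endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request([{"role": "user", "content": "Hello"}])
print(req.full_url)
```

Because the request shape is identical, pointing at a local server (llama.cpp, vLLM, Ollama, etc.) instead of api.openai.com is just a different `BASE_URL`.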
Thanks, but I’ll just read the webpage instead.