Forked from will-lms/openai-compat-endpoint
Description
Adds a drop-down list of models. Models are served via openrouter.ai. baseUrl: "https://openrouter.ai/api/v1"
README
Use any OpenAI-compatible API in LM Studio.
Any models via OpenRouter.ai API
Just select a provider and model, then enter your OpenRouter API key (it looks like "sk-or-v1-111..."). Create a key first at https://openrouter.ai/settings/keys; some models can be used for free.
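As a sketch of what the plugin does with the key, here is how an OpenAI-compatible chat-completions request to the baseUrl above can be assembled (the function name, model id, and key placeholder are illustrative, not the plugin's actual code):

```javascript
// Build an OpenAI-compatible chat-completions request for OpenRouter.
// The baseUrl matches the one configured above; the model id is illustrative.
function buildChatRequest(apiKey, model, messages) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage (hypothetical key): pass the result straight to fetch().
// const { url, options } = buildChatRequest(
//   "sk-or-v1-...", "stepfun/step-3.5-flash:free",
//   [{ role: "user", content: "Hello" }]);
// const res = await fetch(url, options);
```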
todo: populate the drop-down menu with the model list fetched online from the endpoint. Half-done; that is the next step.
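One way to finish this step, as a sketch: assuming the endpoint's models route returns the usual OpenAI-compatible shape, a JSON object with a `data` array of `{ id, ... }` entries, the drop-down entries can be extracted like this (the function name and `:free` filter are illustrative):

```javascript
// Extract model ids from an OpenAI-compatible /models response
// to populate the drop-down. Assumed response shape:
// { data: [{ id: "vendor/model:free", ... }, ...] }
function extractModelIds(modelsResponse, freeOnly = false) {
  const ids = (modelsResponse.data || []).map((m) => m.id);
  return freeOnly ? ids.filter((id) => id.endsWith(":free")) : ids;
}

// Fetching would look like (not run here):
// const res = await fetch("https://openrouter.ai/api/v1/models");
// const ids = extractModelIds(await res.json(), true);
```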
The availability of free models for requests depends on many factors: some are always available, while others return a 429 error after two or three requests. At the moment "stepfun/step-3.5-flash:free" and "arcee-ai/trinity-mini:free" are available.
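Since free models can start answering with 429 after a few requests, a simple hedge is exponential backoff before retrying. A minimal sketch, with illustrative delay values (not part of the plugin):

```javascript
// Decide how long to wait before retrying, given the HTTP status and
// how many retries have already happened. Returns null when the status
// is not retryable or the retry budget is spent. Delays are illustrative.
function retryDelayMs(status, attempt, maxAttempts = 3) {
  if (status !== 429 || attempt >= maxAttempts) return null;
  return 1000 * 2 ** attempt; // 1s, 2s, 4s, ...
}
```

A caller would loop over the request, sleeping for `retryDelayMs(...)` milliseconds after each 429 until it returns null, then either give up or switch to another free model.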
After installing the plugin, all of its code lives in production.js (alias dev.js). This is the file that runs every time the plugin is launched, so any text edit to it takes effect immediately; use with caution. The other files are not needed at all for the plugin to work: everything required is in this single file.