README.md
This is for connecting local server models such as llama.cpp, Ollama, etc. — any server with a base URL — to LM Studio. The original plugin does not allow the user to add their own model names (as far as I know), so I could not load/unload or query my models.
This plugin queries the user's /v1/models endpoint to list their available models and adds them to the model dropdown in the UI, overwriting the OpenAI and Anthropic defaults.
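The /v1/models endpoint on OpenAI-compatible servers returns a JSON list of model objects; the plugin only needs the `id` of each one to populate the dropdown. A minimal sketch of that parsing step (the sample payload and model IDs below are illustrative, not from any specific server):

```python
import json

# Illustrative response in the shape an OpenAI-compatible
# /v1/models endpoint typically returns (model IDs are made up).
sample_response = json.dumps({
    "object": "list",
    "data": [
        {"id": "llama-3-8b-instruct", "object": "model"},
        {"id": "qwen2.5-coder-7b", "object": "model"},
    ],
})

def list_model_ids(body: str) -> list[str]:
    """Extract the model IDs from a /v1/models JSON response body."""
    payload = json.loads(body)
    return [model["id"] for model in payload.get("data", [])]

print(list_model_ids(sample_response))
```

In practice the plugin would fetch this JSON from `<base_url>/v1/models` and feed the resulting IDs into the model dropdown.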
To do this: