How to set up an LLM provider

All providers require some amount of setup:

- Ollama, as a local model provider, requires a local installation. See the Ollama website for installation instructions.

- All other providers require an API key that you configure locally:
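
A common way to configure an API key locally is to export it as an environment variable. A minimal sketch, assuming the provider reads a conventional variable name (the exact name, e.g. `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`, depends on the provider's documentation):

```shell
# Hypothetical variable name -- check your provider's docs for the real one.
export OPENAI_API_KEY="sk-your-key-here"

# Confirm the key is visible to child processes without printing its value:
echo "${OPENAI_API_KEY:+API key is set}"
```

To make the key persist across sessions, add the `export` line to your shell profile (e.g. `~/.bashrc` or `~/.zshrc`) rather than typing it each time.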