Chatting with an LLM using Ollama or OpenAI
The basic interface for working with an LLM is GtLlmChat. It allows the user to interact with an LLM programmatically or through a UI. The backend of these chats is abstracted and can be changed.
To chat with OpenAI, ensure that you have added an API key, as described in Adding an OpenAI API key. Once the API key has been added, you will be able to chat with an LLM.
chat := GtLlmChat new provider: GtOpenAIProvider withApiKeyFromFile
Executing the snippet above opens a chat that you can interact with through a UI. You can also send messages through code.
chat sendMessage: 'Hello friend! Who are you?'
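After sending a message, you may want to read the conversation back programmatically rather than through the UI. The sketch below assumes the chat exposes its transcript through a `messages` accessor; the exact selector may differ in your version of Glamorous Toolkit.

```smalltalk
"Send a message, then inspect the transcript.
The #messages accessor is an assumption; check the GtLlmChat
class in your image for the actual protocol."
chat sendMessage: 'Hello friend! Who are you?'.
chat messages
```

Inspecting the result in Glamorous Toolkit shows the accumulated messages, including the model's replies, which is useful when driving a chat entirely from code.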
Providers are the chat's backend. Currently, backends exist for OpenAI and Ollama. Providers can be changed at any time, even in the middle of a chat.
chat provider: (GtOllamaProvider new model: 'llama3.1'); sendMessage: 'What is your name?'
Note that if you do not already have the requested model locally, it will first be downloaded, which can take some time. Subsequent requests are fast as long as the model remains on the machine.
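To avoid the download happening mid-chat, you can pull the model ahead of time with the Ollama CLI, assuming Ollama is installed and its daemon is running on your machine:

```shell
# Download the model once; later chats will use the local copy.
ollama pull llama3.1

# Optionally confirm the model is now available locally.
ollama list
```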