Chatting with an LLM in Ollama, Anthropic or OpenAI
The basic interface for working with an LLM is the GtLlmChat. It allows the user to interact with an LLM programmatically or through a UI. The backend of these chats is abstracted and can be changed.
You can begin a chat using the default connection, if one is set up. See Managing connections and setting the default for how to check this and change the default connection.
chat := GtLlmChat new
You can also connect to a specific provider. To chat with OpenAI, for instance, ensure that you have added an API key, as described in Adding an OpenAI API key. Once the key has been added, you can chat with an LLM.
chat := GtLlmChat new provider: GtOpenAIProvider withApiKeyFromFile
Executing the snippet above lets you interact with the chat through a UI. You can also send messages from code.
chat sendMessage: 'Hello friend! Who are you?'
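You can also read the conversation back from code instead of the UI. The messages accessor below is an assumption about the GtLlmChat API and may be named differently in your version:
"Assumed accessor: answers the messages exchanged so far; the last one is the latest reply"
chat messages last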
Providers are the chat's backend; backends currently exist for Anthropic, OpenAI, and Ollama.
Providers can be changed at any time, even in the middle of a chat.
chat provider: (GtOllamaProvider new model: 'llama3.1'); sendMessage: 'What is your name?'
Note that if you do not already have the specified model locally, it will first be downloaded, which can take some time. Subsequent requests should be fast as long as the model remains on the machine.
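If you would rather avoid the wait during a chat, you can download the model ahead of time from a terminal, assuming the Ollama CLI is installed:
ollama pull llama3.1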
Anthropic works similarly, but here the provider needs both a model and an API key.
chat provider: (GtAnthropicProvider withApiKeyFromFile model: 'claude-3-5-sonnet-20241022'); sendMessage: 'And who are you?'
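Since providers can be changed at any time, you can also switch back; for example, the following returns the chat to the OpenAI provider from above:
chat provider: GtOpenAIProvider withApiKeyFromFile; sendMessage: 'And back to OpenAI. Who are you now?'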