Working with the Ollama API client
Once Ollama is set up and running on your machine, you can start interacting with it right away by creating a client:
client := GtOllamaClient new
This client is your entry point for interacting with Ollama. Next, you can download a model. This step is optional if you’ve already downloaded the models you want. Note that pulling a model can take quite some time, depending on its size.
client pullModel: 'llama3.1'
To find out which models are available locally, you can ask the client to list them.
client listModels
Then we can start chatting with a model. A chat request takes a model name and a list of messages; here we send a system message followed by a user message.
messages := {
	GtLlmSystemMessage new content: 'You are a helpful assistant.'.
	GtLlmUserMessage new content: 'Hello Llama! How are you today?' }.
client completeChatWithModel: 'llama3.1' andMessages: messages
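The single-turn request above can be extended into a multi-turn conversation by appending the model's reply to the message list before sending the next request. The sketch below is a hypothetical continuation: the `GtLlmAssistantMessage` class and the `content` accessor used to read the reply are assumptions, not confirmed API, so check the actual response object in the inspector before relying on them.

```smalltalk
"Hypothetical sketch: the shape of the response object is assumed."
response := client
	completeChatWithModel: 'llama3.1'
	andMessages: messages.
"Append the assistant's reply and a follow-up question, then ask again."
messages := messages , {
	GtLlmAssistantMessage new content: response content.
	GtLlmUserMessage new content: 'Can you say that in one sentence?' }.
client completeChatWithModel: 'llama3.1' andMessages: messages
```

Keeping the full message history in the list is what gives the model context for follow-up questions, since each request is stateless on its own.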