File search in the response provider

Just as in Adding the GT book to a LLM to answer questions about GT, we can add file search to the OpenAI responses provider.

This requires us to set up and manage our own files and vector stores, but allows the provider to remain stateless.

In the following, we are going to re-create the assistant built in Adding the GT book to a LLM to answer questions about GT using the responses provider and manual file management.

First, we need to set up the book and add it to a vector store. The book file generation is unchanged from the case study.

pages := LeDatabase gtBook pages.

aFile := FileLocator temp / 'GToolkitBook.txt'.
aFile ensureDelete.
aFile
	writeStreamDo: [ :aStream | pages do: [ :aPage | aStream nextPutAll: aPage asMarkdownPage ] ]
  

Unlike in that example, however, we need to set up our own vector store. To do this, we first upload the file to OpenAI.

apiFile := GtOpenAIClient withApiKeyFromFile
		uploadFile: aFile
		withPurpose: 'assistants'
  

We then create the vector store with that single file.

vectorStore := GtOpenAIClient withApiKeyFromFile
		createVectorStoreNamed: 'GToolkitBook'
		withFiles: {apiFile id}
  
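For orientation, this call corresponds to OpenAI's vector stores endpoint (`POST https://api.openai.com/v1/vector_stores`). The request body it ultimately produces should look roughly like the following; the file identifier is a placeholder, and the exact mapping from `createVectorStoreNamed:withFiles:` to this payload is an assumption on our part, though the wire format itself is OpenAI's documented shape:

```json
{
  "name": "GToolkitBook",
  "file_ids": ["file-abc123"]
}
```

The response includes an `id` of the form `vs_...`, which is what `vectorStore id` answers when we configure the tool below.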

Now we can use that store in the assistant.

The assistant itself is generally unchanged from the case study, with the exception that we do not need to reset the provider.

assistant := GtLlmAssistant new
		description: 'You are an assistant that answers questions about Glamorous Toolkit (also: GToolkit or GT) by referring to the GToolkit book provided to you.'.

assistant
  

We then create a chat and add the file search tool.

chat := assistant createChat.

chat provider
	addTool: (GtLlmFileSearchTool new vectorStoreIds: {vectorStore id})
  
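When the chat sends a request, the file search tool ends up in the `tools` array of the Responses API payload. Per OpenAI's public API documentation it takes the following shape; how `GtLlmFileSearchTool` serializes itself internally is an assumption here, and the vector store identifier is a placeholder:

```json
{
  "tools": [
    {
      "type": "file_search",
      "vector_store_ids": ["vs_abc123"]
    }
  ]
}
```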

Finally, we can ask our questions.

chat sendChatRequest: 'What is Ludo?'