File search in the response provider
We can add file search to the OpenAI response provider. This requires us to set up and manage our own files and vector stores, but allows the provider to remain stateless.
First we need to export the book to a file and add it to a vector store. The export is slightly altered — every occurrence of 'Ludo' is renamed to 'Bludo' — so that we can ensure the test query cannot be answered from what the LLM already knows.
pages := LeDatabase gtBook pages.
file := FileLocator temp / 'GToolkitBook.txt'.
file ensureDelete.
file writeStreamDo: [ :aStream |
	pages do: [ :aPage |
		aStream nextPutAll: ((GtLLepiterContentExporter new
			page: aPage;
			export) copyReplaceAll: 'Ludo' with: 'Bludo') ] ]
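The 'Ludo' → 'Bludo' substitution is what makes the test meaningful: the model cannot have seen 'Bludo' during training, so a correct answer about it can only come from the uploaded file. A minimal sketch of the same idea in Python (the page texts here are made up for illustration):

```python
def export_with_renamed_entity(pages, old="Ludo", new="Bludo"):
    """Concatenate page exports, renaming a known entity so that
    questions about it are only answerable from the exported document."""
    return "".join(page.replace(old, new) for page in pages)


pages = [
    "Ludo is a board game modeled in the GToolkit book.",
    "The Ludo case study demonstrates moldable development.",
]
text = export_with_renamed_entity(pages)
print("Ludo" in text)       # False: the original name no longer appears
print(text.count("Bludo"))  # 2
```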
We then need to set up our own vector store. To do this, we need to upload the file to OpenAI beforehand.
apiFile := GtOpenAIClient withApiKeyFromFile uploadFile: file withPurpose: 'assistants' "or: 'user_data'"
We then create the vector store with that single file.
vectorStore := GtOpenAIClient withApiKeyFromFile
	createVectorStoreNamed: 'GToolkitBook'
	withFiles: {apiFile id}
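Behind this call, the client presumably issues a `POST /v1/vector_stores` request to OpenAI. The request body has roughly the following shape — this is a hedged sketch of the public OpenAI API, not of GToolkit's internals, and the file id is a placeholder:

```python
import json


def vector_store_request(name, file_ids):
    """Build the JSON body for OpenAI's `POST /v1/vector_stores` endpoint.
    Sketch of the documented shape; the GToolkit client may add options."""
    return {"name": name, "file_ids": list(file_ids)}


body = vector_store_request("GToolkitBook", ["file-abc123"])
print(json.dumps(body))
```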
Now we can use that store in the chat.
The chat itself is similar to the one from the case study, except that we do not need to reset the provider.
chat := GtLChat new
	instruction: 'You are an assistant that answers questions about Glamorous Toolkit (also: GToolkit or GT) by referring to the GToolkit book provided to you. Always check your answer with file search before answering.'
		titled: 'Purpose';
	provider: (GtLConnection new
		providerClass: GtLOpenAIProvider;
		label: 'GPT 5.1';
		model: 'gpt-5.1';
		addOption: 'isStreaming' withValue: false) buildProvider;
	tools: {GtLFileSearchTool new vectorStoreIds: {vectorStore id}};
	sendMarkdown: 'What is Ludo?'
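In the request the provider ultimately sends, the `GtLFileSearchTool` corresponds to a `file_search` tool entry carrying the vector store ids. A hedged sketch of that entry as documented for OpenAI's Responses API (how GToolkit serializes it internally is an assumption; `max_num_results` is an optional documented field):

```python
def file_search_tool(vector_store_ids, max_num_results=None):
    """Build the `file_search` tool entry for OpenAI's Responses API.
    Only the required fields are emitted unless options are given."""
    tool = {"type": "file_search", "vector_store_ids": list(vector_store_ids)}
    if max_num_results is not None:
        tool["max_num_results"] = max_num_results
    return tool


print(file_search_tool(["vs_123"]))
```

Because the vector store ids are passed with every request, the provider itself stays stateless: all retrieval state lives in the files and vector stores we manage ourselves.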