add comment on how to do retrieval
main.go | 7 +++++++
1 file changed, 7 insertions(+)
@@ -115,3 +115,10 @@ func singlePromptInteraction(systemPrompt, prompt string, retries int) (openai.C
 	return resp, nil
 }
 
+// TODO: anything to be stored in the database should be chunked to sizes between 512 and 1024 tokens.
+// Each chunk should also overlap the previous chunk by 100-200 tokens.
+// When the LLM asks for more context, it should be able to use the database to find the most relevant chunks. Here is how:
+// we will get the embeddings for each prompt and use those embeddings to search for the closest 6 chunks,
+// use a separate LLM prompt to make an attempt at selecting and sorting those chunks based on the user's input,
+// and then add the best-matched chunks to the main prompt as further context for the given prompt.
+
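The TODO above only describes the retrieval flow; this commit adds no code for it. Below is a minimal sketch of that flow in Go, under stated assumptions: the Chunk type and the helpers chunkText, closestChunks, and buildAugmentedPrompt are hypothetical names not present in the repository, whitespace-separated words stand in for tokens, and the actual embedding call (e.g. against an embeddings API) is left out, with fake vectors used in main.

package main

import (
	"fmt"
	"math"
	"sort"
	"strings"
)

// Chunk pairs a stored piece of text with its embedding vector.
type Chunk struct {
	Text      string
	Embedding []float32
}

// chunkText splits text into word-based chunks of roughly chunkSize tokens,
// overlapping each chunk with the previous one by overlap tokens
// (512-1024 and 100-200 respectively, per the TODO). Words are a stand-in
// for tokens here; a real tokenizer would count tokens instead.
func chunkText(text string, chunkSize, overlap int) []string {
	words := strings.Fields(text)
	var chunks []string
	for start := 0; start < len(words); start += chunkSize - overlap {
		end := start + chunkSize
		if end > len(words) {
			end = len(words)
		}
		chunks = append(chunks, strings.Join(words[start:end], " "))
		if end == len(words) {
			break
		}
	}
	return chunks
}

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float32) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		na += float64(a[i]) * float64(a[i])
		nb += float64(b[i]) * float64(b[i])
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// closestChunks returns the k stored chunks whose embeddings are most
// similar to the prompt's embedding (k = 6 in the TODO).
func closestChunks(promptEmbedding []float32, store []Chunk, k int) []Chunk {
	sorted := append([]Chunk(nil), store...)
	sort.Slice(sorted, func(i, j int) bool {
		return cosine(sorted[i].Embedding, promptEmbedding) > cosine(sorted[j].Embedding, promptEmbedding)
	})
	if k > len(sorted) {
		k = len(sorted)
	}
	return sorted[:k]
}

// buildAugmentedPrompt appends the selected chunks to the user's prompt
// as further context, as the last TODO line describes.
func buildAugmentedPrompt(prompt string, chunks []Chunk) string {
	var b strings.Builder
	b.WriteString("Context:\n")
	for _, c := range chunks {
		b.WriteString(c.Text)
		b.WriteString("\n---\n")
	}
	b.WriteString("\nQuestion: ")
	b.WriteString(prompt)
	return b.String()
}

func main() {
	// Toy demonstration with fake 2-dimensional embeddings; in practice the
	// vectors would come from an embeddings API and be stored in the database.
	store := []Chunk{
		{Text: "notes on retrieval", Embedding: []float32{1, 0}},
		{Text: "notes on chunking", Embedding: []float32{0, 1}},
	}
	_ = chunkText("text to be stored would be chunked like this", 512, 150)
	best := closestChunks([]float32{0.9, 0.1}, store, 1)
	fmt.Println(buildAugmentedPrompt("how does retrieval work?", best))
}

The LLM-based selection step from the comments is not shown; it could reuse singlePromptInteraction with the six candidate chunks and the user's input, then pass whatever it keeps to buildAugmentedPrompt.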