restarting chat with history #17

Open
eladrave opened this issue Mar 12, 2023 · 0 comments
If you save the chat history to a database, it eventually grows too large: the prompt will exceed the model's token limit, or the request will simply fail.

Another option is to use embeddings.

I was researching this and found https://llamahub.ai/, which essentially takes data from a source and creates embeddings for it.
Here is a simple snippet of what I did:

import os
from pathlib import Path

from llama_index import download_loader, GPTSimpleVectorIndex

os.environ["OPENAI_API_KEY"] = 'MY OPEN AI KEY'

# Download the PDF loader from LlamaHub and read the file into documents
PDFReader = download_loader("PDFReader")
loader = PDFReader()
documents = loader.load_data(file=Path('/Users/eladrave/Downloads/datafile.pdf'))

# Build a vector index over the documents and persist it for later reuse
index = GPTSimpleVectorIndex(documents)
index.save_to_disk("documents.json")

# Query the index; the most relevant chunks are retrieved and sent to the LLM
response = index.query("What are terms?")
print(response)

(This just reads a PDF)

We can save the output of chat.get_conversation() to a file when a session "ends" (timeout, etc.) and then index it like this:

import os
from pathlib import Path

from llama_index import download_loader, GPTSimpleVectorIndex

os.environ["OPENAI_API_KEY"] = 'MY OPEN AI KEY'

# Download a generic text loader and read the saved conversation file
UnstructuredReader = download_loader("UnstructuredReader")
loader = UnstructuredReader()
documents = loader.load_data(file=Path('memfilewithconversation.txt'))

# Index the conversation so past exchanges can be retrieved by similarity
index = GPTSimpleVectorIndex(documents)

response = index.query("What are terms?")
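As a sketch of the save step, assuming chat.get_conversation() returns the transcript as a list of (role, text) pairs (the exact return shape depends on the chat wrapper, so adapt as needed):

```python
from pathlib import Path


def save_conversation(conversation, path="memfilewithconversation.txt"):
    """Write a finished session's messages to the memory file.

    `conversation` is assumed to be a list of (role, text) tuples,
    e.g. the result of chat.get_conversation().
    """
    lines = [f"{role}: {text}" for role, text in conversation]
    Path(path).write_text("\n".join(lines) + "\n")


# Example: persist a short session so it can be indexed later
save_conversation([("user", "What are terms?"), ("assistant", "Terms are...")])
```

The file written here is what the UnstructuredReader snippet above loads and indexes.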

Then, when you send a prompt to the chat, you wrap the retrieved text in the prompt, something like:

I will ask you questions based on the following context:
— Start of Context —

The_Response_from_above

— End of Context—
My question is: “How much wood would a woodchuck chew?”
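The wrapping step could be a small helper like the one below (the function name and exact formatting are illustrative, not part of any library; `context` stands in for the response returned by index.query()):

```python
def build_prompt(context: str, question: str) -> str:
    """Wrap retrieved context and the user's question into a single prompt."""
    return (
        "I will ask you questions based on the following context:\n"
        "--- Start of Context ---\n"
        f"{context}\n"
        "--- End of Context ---\n"
        f"My question is: {question}\n"
    )


# Example: context would normally be str(index.query(...))
context = "Terms are defined in section 2 of the agreement."
prompt = build_prompt(context, "How much wood would a woodchuck chew?")
print(prompt)
```

The resulting string is then sent as a single message to the chat model.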
