Add new text followup preprocessor #947

Open · wants to merge 20 commits into main
Conversation

@jeffbl (Member) commented Jan 21, 2025

Please ensure you've followed the checklist and provided all the required information before requesting a review.
If you do not have everything applicable to your PR, it will not be reviewed!
If you don't know what something is or if it applies to you, ask!

Resolves #918 and resolves #934.

This PR adds a new preprocessor, text-followup, designed to let a Monarch user speak a follow-up query about the current graphic and receive an LLM-generated response. The new text-followup schema, which contains both a brief and a full response tag, was merged earlier.
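For orientation, the preprocessor's output is JSON conforming to the merged text-followup schema, with one brief and one full response. The field names and values below are only an illustrative sketch, not a copy of the actual schema:

```json
{
  "response_brief": "Yes, there are two dogs in the foreground.",
  "response_full": "The graphic shows two dogs playing near a park bench in the foreground; no other animals are visible."
}
```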

It also implements the prompt override described in #931, but that issue is being left open since the override also needs to be implemented in the other LLM-based preprocessors before it can be closed. The format for this override is documented in the README.md file.

Testing included running Venissa's script (see below) repeatedly with different prompts, which surfaced many ways in which llama3.2-vision resisted producing valid JSON that followed the prompt. Venissa also tested it with the text-followup-handler she is implementing, and it works end-to-end. It still fails occasionally, but failures seem to be infrequent with the current prompt.

```bash
jq '.followup.query = "Are there any animals?"' /home/venicq/follow-up.json \
  | /var/docker/image/bin/sendimagereq /dev/stdin https://unicorn.cim.mcgill.ca/image/render/preprocess/ \
  | jq '.preprocessors."ca.mcgill.a11y.image.preprocessor.text-followup"'
```
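To repeat this test with varied prompts, the command can be wrapped in a small loop along these lines; the query list is just an example, and the paths are the same placeholders as above:

```bash
#!/usr/bin/env bash
# Send several follow-up queries through the preprocess endpoint and print
# only the text-followup preprocessor's output for each one.
queries=(
  "Are there any animals?"
  "What colours dominate the image?"
  "Is there any text in the graphic?"
)
for q in "${queries[@]}"; do
  echo "=== $q ==="
  jq --arg q "$q" '.followup.query = $q' /home/venicq/follow-up.json \
    | /var/docker/image/bin/sendimagereq /dev/stdin https://unicorn.cim.mcgill.ca/image/render/preprocess/ \
    | jq '.preprocessors."ca.mcgill.a11y.image.preprocessor.text-followup"'
done
```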

Also note that I merged main into this branch with git merge, which means some spurious commits are listed in this PR, but it should still merge cleanly into main. Please let me know if there is any trouble, and I'll rebase instead in the future, which I think would have avoided this problem.

@VenissaCarolQuadros not making you a formal reviewer, but since you are the main consumer of this preprocessor, figured you might want to take a look at how the sausage is made, and any comments or suggestions are welcome.

Don't delete below this line.


Required Information

  • I referenced the issue addressed in this PR.
  • I described the changes made and how these address the issue.
  • I described how I tested these changes.

Coding/Commit Requirements

  • I followed applicable coding standards where appropriate (e.g., PEP 8).
  • I have not committed any models or other large files.

New Component Checklist (mandatory for new microservices)

  • I added an entry to docker-compose.yml and build.yml.
  • I created a CI workflow under .github/workflows.
  • I have created a README.md file that describes what the component does and what it depends on (other microservices, ML models, etc.).

OR

  • I have not added a new component in this PR.
