Attribute Error on running bvqa.py describe (moondream?) #20
This is a versioning issue: Transformers has deprecated the method that Moondream 2024-07-23 still calls, and 2024-07-23 is the version we are using. Moondream 2024-08-26+ has a new implementation. Unless there's some other reason not to, @geoffroy-noel-ddh, we could try updating to the newer version.
Thanks, I think we can upgrade the Moondream version, as we'll keep updating the rest of the stack to enable other newer model types. One minor issue is that the experiments I ran last year on the Sculpting Time proof of concept use the 2024-07-23 version. If I need to reproduce those results I'll create a new file under build for specific versions of moondream. More generally, the BVQA requirements are quite loosely set so far (most entries in requirementsX.txt are not bracketed), and this will soon cause similar problems for other describers/models. I'll try to think of a better versioning approach.
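A minimal sketch of what bracketing those entries could look like (the bounds below are illustrative, based only on the versions mentioned in this thread, not a tested set):

```
# build/requirements.txt (sketch)
# Stay below the transformers line that removes the deprecated method:
transformers>=4.48,<4.49
```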
I actually cannot reproduce the bug with vikhyatk/moondream2:2024-07-23 and transformers 4.48.1. The description on Ubuntu completes without errors. I wonder why... Ok, I see. If I re-install the bvqa dependencies from scratch with build/requirements.txt and build/requirements-moondream.txt, transformers 4.48.2 is installed, which still has the deprecated method. But if one installs the dependencies for qwen (build/requirements-qwen), the latest unreleased version of transformers (4.49.0.dev0 as of today) is pulled, and this doesn't have the method. I can reproduce the bug that way. So I think we (read: I) should also remove the
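That interaction could be caught early with a quick version check at startup; a hypothetical helper (the name is mine, not bvqa's) that compares the installed transformers version against the 4.49 line where the method disappears:

```python
# Hypothetical guard: does the installed transformers still predate 4.49,
# the line where the method deprecated in 4.48 is removed?
def predates_4_49(version: str) -> bool:
    # Keep only the leading numeric components, so "4.49.0.dev0" -> (4, 49, 0).
    parts = []
    for p in version.split("."):
        if p.isdigit():
            parts.append(int(p))
        else:
            break
    return tuple(parts) < (4, 49)

# 4.48.2 (pulled by requirements-moondream) still has the deprecated method...
print(predates_4_49("4.48.2"))       # True
# ...while 4.49.0.dev0 (pulled by requirements-qwen) does not.
print(predates_4_49("4.49.0.dev0"))  # False
```

In the real code the installed version would come from `transformers.__version__` (or `importlib.metadata.version("transformers")`).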
While testing 2025-01-09 on my system (from a freshly cloned repo and recreated env)
In contrast, 2024-07-23 describes the three images in 2:44 using all CPUs and 14GB. 2024-08-26 also works well, including with the live version of transformers (4.49). I can't see anyone else reporting similar issues about 2025-01-09 on the moondream repo. I'll test on another machine.
On our ML machine I get this error with 2025-01-09 after a fresh install.
Thanks. The ML error is a known pytorch & cuda version incompatibility, it seems, and I can bypass it. However
Okay, I will check the CPU-only option on the TRE tomorrow. Are you thinking we upgrade to the previous version (2024-08-26) for now? If it's only triggered on the dev build of transformers 4.49, we don't need to upgrade as long as we stick with the 4.48 release, but the deprecated method will break at some point soon.
I noticed Moondream's README recommends using transformers for GPU and, presumably, its own package for CPU. That might explain the problem: the transformers path used to work on CPU (see above) but has become too slow with the latest moondream. If that's true, it means we'd have to split the code in our moondream describer (one branch for CPU, one for GPU) or drop support for CPU (as it's not essential). That's mildly annoying. I'll report the CPU issue with 2025-01-09 under transformers on the moondream repo when I have time, to get feedback.
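The split could be kept to a single dispatch point in the describer; a hedged sketch (the function and backend names are hypothetical, and the actual loaders would come from transformers and moondream's own package respectively):

```python
# Hypothetical dispatch for the moondream describer: transformers on GPU,
# moondream's standalone package on CPU, per the README recommendation.
def pick_moondream_backend(cuda_available: bool) -> str:
    if cuda_available:
        # e.g. AutoModelForCausalLM.from_pretrained("vikhyatk/moondream2", ...)
        return "transformers"
    # e.g. the standalone moondream package's CPU inference path
    return "moondream-package"

# In the real describer the flag would come from torch.cuda.is_available().
print(pick_moondream_backend(True))   # transformers
print(pick_moondream_backend(False))  # moondream-package
```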
Yes, switching to 2024-08-26 as the default would be a good compromise, I think, until I can elucidate the CPU issue with the new version. Thanks for your help!
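Defaulting to that snapshot is a one-line pin if the describer loads via transformers; a sketch, assuming the describer builds its from_pretrained kwargs somewhere like this (the helper name is hypothetical):

```python
# Sketch: pin the moondream2 revision so the default stays at 2024-08-26.
DEFAULT_REVISION = "2024-08-26"

def moondream_load_kwargs(revision: str = DEFAULT_REVISION) -> dict:
    # Passed on to AutoModelForCausalLM.from_pretrained; trust_remote_code
    # is needed because moondream2 ships its own modelling code.
    return {
        "pretrained_model_name_or_path": "vikhyatk/moondream2",
        "revision": revision,
        "trust_remote_code": True,
    }

print(moondream_load_kwargs()["revision"])  # 2024-08-26
```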
I have replicated the unusably slow performance on CPU on the TRE for
Thank you. I reported that last night after trying again on two machines from scratch. Let's wait and see...
NB. Unable to paste the stack trace from the TRE VM to this issue, so I will try to capture the relevant details:
On running /test/python ../bvqa.py describe (no args, so assuming moondream by default?)
First warning as follows:
Second warning appears as first description is attempted:
Final error suggests syntax problem