
Bug: Running incremental model on over half a million rows of data #512

Open
g-diwakar opened this issue Feb 12, 2025 · 0 comments

My pipeline was working fine for a month until the data volume reached roughly 50 million rows. It then throws an error like this:

what(): {"exception_type":"INTERNAL","exception_message":"Attempted to access index 9223372036854775807 within vector of size 3"}

At first I thought this error came from the DuckDB side, but that does not appear to be the case.

There is no error when I run the model in full-refresh mode, but as soon as I run it as an incremental model I hit this error.
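One detail worth noting about the message above: the out-of-range index is exactly INT64_MAX (2^63 − 1), the largest signed 64-bit value. That specific number usually points to a sentinel or wrapped-around index (e.g. a -1 reinterpreted as unsigned) rather than a genuinely computed offset. A minimal check:

```python
# The index reported in the exception message from the error above.
reported_index = 9223372036854775807

# INT64_MAX, the maximum value of a signed 64-bit integer.
int64_max = 2**63 - 1

# They are identical, suggesting a sentinel/overflowed index rather
# than a real position in a 3-element vector.
print(reported_index == int64_max)  # True
```

This does not identify the root cause, but it may help maintainers narrow the search to code paths that initialize or cast vector indices.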
