We need a way to handle checkpoints taken between destructive operations. For example, if I filter the data and lose some rows, I should be able to flag the next checkpoint as "post-destructive", "deduplication", or something similar.
This will pay off a lot once we implement validation (#95) and gain the ability to attach control measures to a data frame instead of the Spark session.
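A minimal sketch of what such flagging could look like — every name here (`Checkpoint`, `checkpoint_after`, the flag strings) is hypothetical, not an existing API; the idea is just that a checkpoint taken right after a row-reducing step carries a marker so later validation knows a row-count change is expected:

```python
# Hypothetical sketch only -- names are illustrative, not an existing API.
from dataclasses import dataclass


@dataclass
class Checkpoint:
    name: str
    row_count: int
    flag: str  # "normal" or "post-destructive"


def checkpoint_after(name: str, rows_before: int, rows_after: int) -> Checkpoint:
    """Create the checkpoint that follows a potentially destructive step.

    If rows were lost (e.g. filtering or deduplication), mark the
    checkpoint as post-destructive so validation can relax strict
    row-count equality checks at this point.
    """
    flag = "post-destructive" if rows_after < rows_before else "normal"
    return Checkpoint(name, rows_after, flag)


# A dedup step that dropped 20 of 100 rows yields a flagged checkpoint.
cp = checkpoint_after("dedup", rows_before=100, rows_after=80)
```

With the flag attached to the checkpoint itself (rather than to session-wide state), downstream validation could decide per checkpoint whether a row-count mismatch is an error or an expected consequence of the preceding operation.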