Sometimes when loading the OPLSDA page, something fails on the backend and the CSV is deleted (or at least no longer available in the frontend).
Some snippets from the stack trace:
```
Traceback (most recent call last):
  File "/home/daniel.chiquito/envs/viime-X305hhT4/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1284, in _execute_context
    cursor, statement, parameters, context
  File "/home/daniel.chiquito/envs/viime-X305hhT4/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 590, in do_execute
    cursor.execute(statement, parameters)
sqlite3.IntegrityError: UNIQUE constraint failed: validated_metabolite_table.csv_file_id

The above exception was the direct cause of the following exception:

...
  File "/home/daniel.chiquito/envs/viime-X305hhT4/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/home/daniel.chiquito/git/viime/viime/views.py", line 558, in save_validated_csv_file
    db.session.commit()
  File "/home/daniel.chiquito/envs/viime-X305hhT4/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 163, in do
    return getattr(self.registry(), name)(*args, **kwargs)
...
sqlalchemy.exc.IntegrityError: (sqlite3.IntegrityError) UNIQUE constraint failed: validated_metabolite_table.csv_file_id
[SQL: INSERT INTO validated_metabolite_table (id, created, csv_file_id, name, normalization, normalization_argument, transformation, scaling, imputation_info, meta, raw_measurements_bytes, measurement_metadata_bytes, sample_metadata_bytes, groups_bytes) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]
[parameters: ('2fb7f9edc9ef46db9e7c0f8f0bb34fab', '2020-10-22 15:17:50.169653', 'efd5cc70522b4e7386382be7424bdf1d', 'Cancer_chemo_muscle_NMR.csv', None, None, None, None, '{"mcar": [], "mnar": []}', '{}', <memory at 0x7f561bc3a7a0>, <memory at 0x7f561bc3abb0>, <memory at 0x7f561bc3a390>, <memory at 0x7f561bc3a600>)]
(Background on this error at: http://sqlalche.me/e/gkpj)
```
I haven't been able to reproduce this reliably, but I've seen it happen both in production and locally. Loading the OPLSDA page (and possibly the PLSDA page?) causes the breadcrumb at the top to change from VIIME >> Data >> filename.csv >> Analyze Data >> OPLS-DA to VIIME >> Data >> {UUID} >> Analyze Data >> OPLS-DA. Going back to the Data page no longer shows the uploaded CSV.
I've found that this is reproducible by loading another analysis (I used Group Prediction) and then opening OPLSDA in a new tab. As it tries to load the dataset, the CSV gets deleted.
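A script-level version of that repro would be to hit the validation step from two threads at once. This is only a sketch: the base URL, route path, and HTTP method below are placeholders, not the actual viime API, so they'd need to be adjusted to whatever route `save_validated_csv_file` is mounted at.

```python
# Hypothetical concurrency repro: call the validation endpoint twice at once.
# BASE_URL, CSV_FILE_ID, and the route are placeholders, not real viime routes.
import threading

import requests

BASE_URL = "http://localhost:5000/api/v1"      # placeholder
CSV_FILE_ID = "<id of an uploaded CSV file>"   # placeholder


def hit_validate():
    # Each request runs the check-then-insert logic in save_validated_csv_file.
    resp = requests.post(f"{BASE_URL}/csv/{CSV_FILE_ID}/validate")  # placeholder path/method
    print(resp.status_code)


threads = [threading.Thread(target=hit_validate) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```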
This is the source of the race condition: https://github.com/girder/viime/blob/master/viime/views.py#L545-L555. It will definitely occur if two validate endpoints are called simultaneously. I'm not sure whether there are other places in the code where the validated table is created, but it would be worth looking for them.
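A minimal sketch of one way to close that window, assuming the handler keeps its current check-then-insert shape: do the INSERT inside a SAVEPOINT and, if the UNIQUE constraint on csv_file_id fires because another request won the race, fall back to the row that already exists. The helper name and the `make_row` callable here are placeholders, not the actual viime code.

```python
from sqlalchemy.exc import IntegrityError


def get_or_create_validated_table(session, ValidatedMetaboliteTable, csv_file_id, make_row):
    """Return the validated table row for csv_file_id, creating it at most once.

    `make_row` stands in for however views.py actually builds the
    ValidatedMetaboliteTable instance.
    """
    existing = (
        session.query(ValidatedMetaboliteTable)
        .filter_by(csv_file_id=csv_file_id)
        .one_or_none()
    )
    if existing is not None:
        return existing

    try:
        # Run the INSERT inside a SAVEPOINT so a losing request doesn't
        # poison the enclosing transaction.
        with session.begin_nested():
            row = make_row()
            session.add(row)
        session.commit()
        return row
    except IntegrityError:
        # Another request inserted the row between our existence check and
        # our INSERT, so the UNIQUE constraint on csv_file_id fired. Recover
        # by returning the row that won the race instead of raising a 500.
        session.rollback()
        return (
            session.query(ValidatedMetaboliteTable)
            .filter_by(csv_file_id=csv_file_id)
            .one()
        )
```

Whatever the exact fix, any other code path that creates a ValidatedMetaboliteTable row would need the same treatment, since the constraint is on the table rather than on any one handler.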