tests: add regression test to check WFs are started properly #37
base: master
Conversation
    logs=None,
    results=None,
)
db.session.commit()
This commit is not required, please remove it.
Also, changing the state of the DB in a test is bad practice, as it affects other tests.
PS: if you feel proactive, you can try to remove the same statement in all the other tests and see whether it was required for them to pass. My guess is that in most cases it is not.
I don't believe that changing the DB state is bad practice; not cleaning it up afterwards is. A proper tear-down function is defined in config.py and runs after each test, which ensures DB consistency for every test. Also, without committing the session the CrawlerJob is never inserted in the DB, so these tests fail because they read it back from the DB.
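For reference, a minimal sketch of the kind of per-test teardown being described, assuming a pytest-style fixture and a db fixture exposing the SQLAlchemy session; the fixture name and the CrawlerJob import path are illustrative, not necessarily the project's actual ones:

import pytest

from inspire_crawler.models import CrawlerJob  # assumed import path


@pytest.fixture(autouse=True)
def clean_crawler_jobs(db):
    # The test body runs at the yield point.
    yield
    # After each test, delete whatever the test committed so the next
    # test starts from a clean database.
    db.session.query(CrawlerJob).delete()
    db.session.commit()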
I'm not sure you need to commit if you are working in the same session. But if you clean up explicitly, it's not too bad.
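A minimal, self-contained sketch of why an explicit commit may be unnecessary when the test and the code under test share one session: with SQLAlchemy's default autoflush, pending objects are flushed before a query runs, so they are already visible inside that session (model and column names here are illustrative):

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()


class Job(Base):
    __tablename__ = 'job'
    id = Column(Integer, primary_key=True)
    status = Column(String)


engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# The pending insert is flushed automatically when the query executes,
# so the object is visible without an explicit commit.
session.add(Job(status='pending'))
assert session.query(Job).filter_by(status='pending').count() == 1

# Nothing was committed, so a rollback undoes the insert entirely.
session.rollback()
assert session.query(Job).count() == 0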
assert str(job.status)
assert job.status == JobStatus.PENDING

with patch('inspire_crawler.tasks.start') as mock:
Use autospec=True (more info here: https://docs.python.org/3/library/unittest.mock.html#autospeccing). I am preparing a PR adding it for all patches in inspire-next.
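A sketch of the suggested change, using the same patch target as in the diff; it assumes the project's test environment (so inspire_crawler is importable), and imports patch from unittest.mock, whereas the project itself may use the standalone mock backport:

from unittest.mock import patch

# autospec=True makes the mock copy the spec of the real
# inspire_crawler.tasks.start, so calls with a wrong signature or
# accesses to attributes the real object does not have fail loudly
# instead of passing silently.
with patch('inspire_crawler.tasks.start', autospec=True) as mock:
    # Exercise the code that is expected to schedule the task here,
    # then assert on mock as the test in this PR does.
    pass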
    errors=None,
    log_file="/foo/bar"
)
mock.apply_async.assert_called()
... then you'll discover that assert_called doesn't exist in our version of mock: https://github.com/testing-cabal/mock/blob/master/mock/mock.py#L309-L317.
That seems to be wrong, I was looking at the wrong place in that file: https://github.com/testing-cabal/mock/blob/master/mock/mock.py#L893.
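For completeness: even on mock releases that predate assert_called, the called attribute and assert_called_with have long been available, so an equivalent check is possible either way. A small self-contained example:

from unittest.mock import MagicMock

mock = MagicMock()
mock.apply_async()

# Portable across old and new mock releases:
assert mock.apply_async.called
# Also long-available: assert the exact arguments of the last call.
mock.apply_async.assert_called_with()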