Support multiple names for gnr_resolve() #12
Hi @lyttonhao, I'll take a look later today...
Thanks, @sckott.
@lyttonhao I used the fix from your fork for parsing more than one result, and fixed it so it works with more than one name passed in. Can you share the example that was failing for you?
Okay. I will test the new code soon. Thanks, @sckott.
Hi @sckott, I think there is still a problem, as I encountered before. It works well when I test 300 names, but it fails when querying 500 or more names. It seems the request parameters must not be too long.
See the documentation for the API at http://resolver.globalnames.org/api. They allow POST requests, which should be a simple thing to add in.
Hi @sckott, I've added some code to use POST, following https://github.com/ropensci/taxize/blob/master/R/gnr_resolve.R#L86-L97. Below is my corresponding code:
However, it seems that `while result_json['status'] == 'working':` becomes an infinite loop. Can you give some advice? Thank you very much.
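One way to avoid the infinite loop described above is to cap the number of polls. The sketch below is an assumption about how the polling step could be structured, not the actual code from the comment (which is not shown in this thread): the status check is factored into an injected `fetch_status` callable standing in for the HTTP request to the GNR job-status URL, and the loop gives up after a fixed number of retries.

```python
import time

def poll_until_done(fetch_status, max_retries=30, delay=2):
    """Poll a GNR job until it stops reporting 'working'.

    `fetch_status` is any zero-argument callable returning the parsed
    JSON of a job-status request (a stand-in for the real HTTP call).
    Returns the final JSON, or raises TimeoutError after `max_retries`
    attempts so the loop can never run forever.
    """
    for _ in range(max_retries):
        result_json = fetch_status()
        if result_json.get('status') != 'working':
            return result_json
        time.sleep(delay)
    raise TimeoutError(
        "GNR job still 'working' after %d polls" % max_retries)
```

With this shape, a stalled job (like the 1,010-name one mentioned later in the thread) fails fast with a clear error instead of hanging.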
@lyttonhao I'll have a look soon; trying to get testing and CI set up first, so we can have checks on all changes/PRs, etc.
@lyttonhao That
Hi @sckott, I'm very sorry that I missed your messages these days. Do you mean to change the
@lyttonhao I'm not sure. I think when the GNR API starts operating in a queue it's not working as it's supposed to. Here is a URL response from a job which I submitted more than 6 hours ago, for a query size of 1,010. It still shows the status as 'working'. Maybe they need to fix things on their end. But at least we got it working for query sizes > 300 but < 1000 by adding POST. @sckott Any ideas?
@lyttonhao @panks I'll take a look at this
If there isn't any hope of it working, then one thing we can do is split lists of size > 1000 into smaller chunks and concatenate their results.
@panks @lyttonhao I just played with this in R, and it seems that when the number of names is > 1000 the job never finishes. I am asking about this now. We should probably not pass more than 1000 names, so just break them up into chunks of < 1000 and pass those.
Yeah, I guess splitting the list is the best way to go as of now. I will do that and send a PR. Thanks!
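The chunk-and-concatenate approach agreed on above could look like the following sketch. This is an assumed shape, not the eventual PR: `chunk_names` keeps each batch safely under the ~1000-name limit observed in the thread, and `resolve_all` takes a `resolve_fn` callable standing in for a call like `gnr_resolve()`.

```python
def chunk_names(names, size=900):
    """Split a list of names into sublists of at most `size` items,
    staying safely below the ~1000-name limit the GNR API chokes on."""
    return [names[i:i + size] for i in range(0, len(names), size)]

def resolve_all(names, resolve_fn, size=900):
    """Resolve names batch by batch and concatenate the results.

    `resolve_fn` is a placeholder for the real resolver call (e.g.
    gnr_resolve); it takes a list of names and returns a list of results.
    """
    results = []
    for batch in chunk_names(names, size):
        results.extend(resolve_fn(batch))
    return results
```

A batch size of 900 rather than exactly 1000 leaves headroom, since the thread only established that > 300 but < 1000 works reliably.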
Since the return line in gnr.py only returns the first result, the current gnr_resolve doesn't support returning results for multiple names. I changed this line to return all results. It works well when the query contains about 100 names, but I get a "No JSON object could be decoded" error when the number is larger. I haven't fixed it yet. Can anyone help?
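The "return all results" change could be sketched as below. The response shape is an assumption based on the public GNR API docs (a `data` list with one entry per supplied name, each carrying `supplied_name_string` and its own `results`); the actual field names in gnr.py may differ.

```python
def collect_results(result_json):
    """Gather results for every supplied name, not just the first.

    Assumes a GNR-style response:
    {'data': [{'supplied_name_string': ..., 'results': [...]}, ...]}
    (field names taken from the resolver.globalnames.org docs).
    Returns a dict mapping each supplied name to its result list.
    """
    out = {}
    for entry in result_json.get('data', []):
        out[entry['supplied_name_string']] = entry.get('results', [])
    return out
```

Returning a name-keyed dict (instead of only `data[0]`) keeps each query's matches associated with the name that produced them.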