
Fixed to work with keras2onnx-made models #76

Open · wants to merge 1 commit into master
Conversation

@mdhimes commented Jul 28, 2020

Using an ONNX model made from keras2onnx resulted in the error discussed here: #23

So I went and fixed the code to work for such models. I left the old code in comments in case this change breaks something for ONNX models made via other means.

However, now that it's working for me, I am finding slightly different predictions between the original Keras model and the model produced by onnx2keras (compiled Keras model --> keras2onnx save --> onnx load --> onnx2keras). I'm not sure at what point in that conversion chain the differences are being introduced. In my case, the differences are -0.1% ± 2.3%, with the worst difference being 9.3%.
I did some additional testing of the ONNX file by using other software to make the predictions, and the onnx2keras model's predictions are identical to those. So the output differences are confirmed not to be due to onnx2keras.
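
For reference, the round-trip comparison looks roughly like the sketch below (a minimal example with a toy model standing in for my real network; the layer sizes, file name, and random test input are just placeholders):

import numpy as np
import keras
import onnx
import keras2onnx
from onnx2keras import onnx_to_keras

# Toy stand-in for the real network (illustrative only).
model = keras.models.Sequential([
    keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# Keras --> ONNX --> disk
onnx_model = keras2onnx.convert_keras(model, model.name)
keras2onnx.save_model(onnx_model, 'model.onnx')

# disk --> ONNX --> Keras
loaded = onnx.load('model.onnx')
restored = onnx_to_keras(loaded, [loaded.graph.input[0].name])

# Compare predictions of the original and round-tripped models.
x = np.random.rand(16, 4).astype(np.float32)
orig = model.predict(x)
back = restored.predict(x)
print('max |relative difference|:', np.max(np.abs((back - orig) / orig)))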

@gmalivenko (Owner) commented:

Hello @mdhimes!
Thanks for your pull request, I'll review / test it as soon as possible.

@mdhimes (Author) commented Jul 28, 2020

Sounds good, let me know if I can help in any way, like sharing the model files.

FYI, my versions are onnx and keras2onnx 1.7.0, onnx2keras 0.0.22, Keras 2.2.4, Tensorflow 1.13.2

@mdhimes (Author) commented Sep 14, 2020

Hi @nerox8664, any update?

@mdhimes (Author) commented Feb 1, 2021

Hi @nerox8664, just checking whether you have looked into this yet. It's still a problem as of my last check, and I haven't found any issue with my fix.

@bwery commented Mar 9, 2021

I encountered a similar error when converting a model that was produced with Caffe and converted to ONNX using CoreML and then onnxmltools.

Looking at the code and at the modifications made by mdhimes, I see that the problem is in lines 78 to 93 of "converter.py".

I was wondering why the 'name' is looked up through field indexes. It seems that it can be retrieved directly as an attribute.

I have made the following modification on my system, and it seems to work fine with my network. Would this be a correct solution?

for onnx_w in onnx_weights:
    # Read the initializer name directly from the TensorProto attribute.
    onnx_extracted_weights_name = onnx_w.name
    weights[onnx_extracted_weights_name] = numpy_helper.to_array(onnx_w)

    # Previous index-based lookup, kept for reference:
    # try:
    #     if len(onnx_w.ListFields()) < 4:
    #         onnx_extracted_weights_name = onnx_w.ListFields()[1][1]
    #     else:
    #         onnx_extracted_weights_name = onnx_w.ListFields()[2][1]
    #     weights[onnx_extracted_weights_name] = numpy_helper.to_array(onnx_w)
    # except:
    #     onnx_extracted_weights_name = onnx_w.ListFields()[3][1]
    #     weights[onnx_extracted_weights_name] = numpy_helper.to_array(onnx_w)

    logger.debug('Found weight {0} with shape {1}.'.format(
                 onnx_extracted_weights_name,
                 weights[onnx_extracted_weights_name].shape))
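
For what it's worth, the name does appear to be exposed directly on each initializer proto; a quick standalone check like this (the file path is just a placeholder) prints the names and shapes straight from an ONNX file:

import onnx
from onnx import numpy_helper

model = onnx.load('model.onnx')  # placeholder path
for init in model.graph.initializer:
    print(init.name, numpy_helper.to_array(init).shape)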

Would a test using the function "HasField()" be useful to fall back on the index-based method in cases where the attribute has another name?
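
For illustration, such a fallback might look something like this (untested sketch; it reuses the onnx_weights list and weights dict from converter.py, and checks for an empty name string rather than calling HasField(), since presence checks on scalar fields depend on which proto syntax the bindings were generated from):

from onnx import numpy_helper

for onnx_w in onnx_weights:
    if onnx_w.name:
        # Name is populated: read it directly from the attribute.
        onnx_extracted_weights_name = onnx_w.name
    else:
        # Otherwise fall back to the original index-based lookup.
        fields = onnx_w.ListFields()
        if len(fields) < 4:
            onnx_extracted_weights_name = fields[1][1]
        else:
            onnx_extracted_weights_name = fields[2][1]
    weights[onnx_extracted_weights_name] = numpy_helper.to_array(onnx_w)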
