
Opening jitted model is impossible #2

Open
RobinGRAPIN opened this issue Aug 9, 2022 · 0 comments

Hello, I am looking for a way to use Faster-RCNN in an iOS mobile app, so I tried your tutorial from the beginning by running converter.py; I have struggled for months to implement the missing functions (RoIAlign, NMS, and so on) in C++ on my own. But the line m = torch.jit.load('./maskrcnn/model_freezed.pt') raises the following error:

WARNING:root:Torch version 1.12.1 has not been tested with coremltools. You may run into unexpected errors. Torch 1.10.2 is the most recent version that has been tested.
Traceback (most recent call last):
  File "/Users/rgn12/Desktop/CoreML-MaskRCNN/converter/converter.py", line 27, in <module>
    m = torch.jit.load('./maskrcnn/model_freezed.pt')
  File "/Users/rgn12/Library/Python/3.8/lib/python/site-packages/torch/jit/_serialization.py", line 162, in load
    cpp_module = torch._C.import_ir_module(cu, str(f), map_location, _extra_files)
RuntimeError: 
Unknown builtin op: _caffe2::GenerateProposals.
Could not find any similar ops to _caffe2::GenerateProposals. This op may not exist or may not be currently supported in TorchScript.
:
  File "code/__torch__/detectron2/export/caffe2_modeling/___torch_mangle_546.py", line 81
    scores0 = torch.detach(scores)
    bbox_deltas0 = torch.detach(bbox_deltas)
    rpn_rois, _1 = ops._caffe2.GenerateProposals(scores0, bbox_deltas0, im_info, CONSTANTS.c96, 0.0625, 100, 10, 0.69999999999999996, 0., True, -180, 180, 1., False)
                   ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
    input54 = ops._caffe2.RoIAlign(input44, rpn_rois, "NCHW", 0.0625, 6, 6, 0, True)
    input55 = torch._convolution(input54, CONSTANTS.c97, CONSTANTS.c98, [1, 1], [0, 0], [1, 1], False, [0, 0], 1, False, False, True, True)

So the problem appears while loading the jitted model, and the functions implemented in custom_ops and custom_mil_ops are not used at that point, since their implementations only come into play during the conversion to CoreML. Do you know how to solve this? Thank you very much in advance!
