How to run a custom model (.pth) in C++


I have trained a Mask DINO model for an instance segmentation task. To get better inference performance, I'd like to run this model in C++.

I’ve already tried some methods and encountered errors along the way.

  1. I attempted to convert the model to ONNX using PyTorch. After loading the weights with torch.load(), I called model.eval(). This call threw an error because torch.load() returned a dictionary rather than a model, which I didn't expect. I ran into the same issue when using TorchScript (see the first sketch after this list).

  2. The original repository (IDEA-Research's MaskDINO) is based on detectron2, so I tried converting with detectron2's export tools (see the second sketch after this list) and hit an error similar to the one described in this issue.
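
For step 1, the eval() error comes from calling it on the dictionary that torch.load() returns instead of on a model instance; the architecture has to be built first and the weights loaded into it. Below is a minimal sketch of that fix, assuming a detectron2-style checkpoint (weights stored under a "model" key); the config path, checkpoint name, and the add_maskdino_config import are assumptions based on the IDEA-Research repo layout.

```python
# Minimal sketch: build the module from the config, then load the .pth into it.
import torch
from detectron2.config import get_cfg
from detectron2.modeling import build_model
from detectron2.checkpoint import DetectionCheckpointer
# from maskdino import add_maskdino_config   # name used in the MaskDINO repo (assumed)

ckpt = torch.load("model_final.pth", map_location="cpu")
print(type(ckpt))   # dict -- torch.load() restores the saved state, not an nn.Module

cfg = get_cfg()
# add_maskdino_config(cfg)                          # register MaskDINO's extra config keys first
cfg.merge_from_file("configs/maskdino_r50.yaml")    # hypothetical config path

model = build_model(cfg)                              # build the nn.Module from the config
DetectionCheckpointer(model).load("model_final.pth")  # loads the checkpoint's "model" dict into it
model.eval()                                          # works now: called on a module, not a dict
```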
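
For step 2, the detectron2 route typically wraps the model in detectron2.export.TracingAdapter before calling torch.jit.trace or torch.onnx.export. A sketch of that attempt is below (the input shape and output file names are placeholders); this is the point where the Mask2Former components reportedly break.

```python
# Sketch of the detectron2 tracing route, assuming TracingAdapter and the model built above.
import torch
from detectron2.export import TracingAdapter

image = torch.rand(3, 800, 800)          # dummy CHW float image
inputs = [{"image": image}]              # detectron2's standard input format

adapter = TracingAdapter(model, inputs)  # wraps the model to take/return flat tensors
traced = torch.jit.trace(adapter, adapter.flattened_inputs)
traced.save("maskdino_traced.pt")        # a traced module could be loaded in C++ via torch::jit::load
# torch.onnx.export(adapter, adapter.flattened_inputs, "maskdino.onnx", opset_version=16)
```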

I suspect these errors come from the Mask2Former code that Mask DINO includes, as mentioned in a comment on the issue above, which says that code isn't supported for ONNX export.

My main question is: is there any way to run this model in C++? I really need to do this.
