How to run a torch DataLoader in a sub-process of multiprocessing.Pool?
I want to run model inference in multiple processes. Instead of using torch.distributed, how can I use multiprocessing.Pool, with each pool worker running its own DataLoader?
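Here is a minimal sketch of what I have in mind. The model, the data, and the worker function `run_inference` are just placeholders for illustration; I'm unsure whether a DataLoader can safely live inside a Pool worker like this.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
import multiprocessing as mp


def run_inference(shard_id):
    # Hypothetical worker: build the model and the DataLoader inside the subprocess.
    model = nn.Linear(16, 4)        # placeholder model
    model.eval()

    data = torch.randn(128, 16)     # placeholder data for this shard
    dataset = TensorDataset(data)

    # num_workers=0 here, because Pool workers are daemonic and cannot
    # spawn the DataLoader's own worker processes.
    loader = DataLoader(dataset, batch_size=32, num_workers=0)

    outputs = []
    with torch.no_grad():
        for (batch,) in loader:
            outputs.append(model(batch))
    return shard_id, torch.cat(outputs).shape


if __name__ == "__main__":
    ctx = mp.get_context("spawn")   # "spawn" to avoid fork-related issues with torch
    with ctx.Pool(processes=2) as pool:
        for shard_id, shape in pool.map(run_inference, range(2)):
            print(f"shard {shard_id}: output shape {shape}")
```

Is this the right pattern, or is there a way to keep `num_workers > 0` in the DataLoader inside each pool process?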