How to use a Flux model without CUDA on Apple Silicon (Metal) devices?


When I try to run the FluxPipeline in a Jupyter notebook, it fails with a cudart assertion error, but I can't install a CUDA-compatible build of torch on my M1 Pro Mac. Is there a workaround for this?

Loading the model itself completes without error.

Code I used:

import torch
from diffusers import FluxPipeline

model_id = "black-forest-labs/FLUX.1-schnell"

# Load the pipeline in bfloat16, then offload submodules to the CPU to save memory
pipe = FluxPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
pipe.enable_model_cpu_offload()

The snippet I ran:

code

And this is the error:

error
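For reference, the direction I was hoping for is something like the sketch below: skip the CUDA-based CPU offload and move the pipeline to PyTorch's mps (Metal) backend instead. This is untested on my side; the prompt, step count, and output filename are made up for illustration, and bfloat16 support on MPS may depend on the PyTorch version.

import torch
from diffusers import FluxPipeline

model_id = "black-forest-labs/FLUX.1-schnell"

# Assumption: a recent PyTorch build with the Metal (MPS) backend available.
device = "mps" if torch.backends.mps.is_available() else "cpu"

# bfloat16 on MPS depends on the PyTorch version; float16 is a possible fallback.
pipe = FluxPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
pipe.to(device)

# Hypothetical prompt and settings, only for illustration.
image = pipe(
    "a photo of a cat",
    num_inference_steps=4,   # schnell is distilled for very few steps
    guidance_scale=0.0,
).images[0]
image.save("flux-schnell-mps.png")

I don't know whether this avoids the cudart assertion, or whether enable_model_cpu_offload() can be pointed at a non-CUDA device at all, which is essentially what I'm asking.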


