High VRAM usage: 21.5GB

#52
by Leonliclash - opened

Why is it that when I use the flux DEV model just for ordinary text-to-image operations, with a resolution of 896x1152, the VRAM usage remains around 21.5GB?

FLUX.1-dev is a 12B-parameter model. When you load a 12B-parameter bf16/fp16 model, the base VRAM requirement for the weights alone is approximately 22GB (12 billion params × 16 bits). And actually the pipeline is not only that 12B transformer; it also includes a large text encoder.
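The weight-only estimate above can be checked with quick arithmetic (a sketch; the ~21.5GB actually observed also covers the text encoders, VAE, and activations):

```python
# Rough VRAM footprint of 12B parameters stored at 16-bit precision.
params = 12_000_000_000     # 12 billion parameters
bytes_per_param = 2         # bf16/fp16 = 16 bits = 2 bytes
weight_bytes = params * bytes_per_param

gib = weight_bytes / 2**30  # binary GB, the unit nvidia-smi reports
print(f"{gib:.1f} GiB")     # ~22.4 GiB for the transformer weights alone
```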

Using the example given, I have issues: even with enable_model_cpu_offload and a reduced image size, I'm still getting out-of-memory errors. I've been debugging on my own and with ChatGPT and Claude and haven't come up with anything yet. I've got 24GB of VRAM.

Instead of the example they give, which uses this:
pipe.enable_model_cpu_offload()

I got it to work using this:
pipe.enable_sequential_cpu_offload()

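The difference between the two calls is granularity: enable_model_cpu_offload keeps one whole pipeline component (e.g. the entire 12B transformer) on the GPU at a time, while enable_sequential_cpu_offload streams individual submodules on and off, so peak VRAM drops to roughly the largest single submodule. A toy sketch of that effect (the component sizes and block counts here are invented for illustration, not real FLUX measurements):

```python
# Toy model of peak VRAM under the two offload strategies (sizes invented).
component_gb = {"text_encoder": 9.0, "transformer": 22.0, "vae": 0.2}

# enable_model_cpu_offload: whole components take turns on the GPU,
# so the peak is the size of the largest component.
peak_model = max(component_gb.values())
print(f"model offload peak:      {peak_model:.1f} GB")        # 22.0 GB

# enable_sequential_cpu_offload: each component is further split into
# submodules that take turns instead (assumed block counts below).
blocks = {"text_encoder": 24, "transformer": 57, "vae": 1}
peak_seq = max(gb / blocks[name] for name, gb in component_gb.items())
print(f"sequential offload peak: {peak_seq:.2f} GB")          # ~0.39 GB
```

The trade-off is speed: sequential offload streams weights over PCIe for every forward pass, so generation is much slower than with model-level offload.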

@dfreelan how much VRAM does it use after pipe.enable_sequential_cpu_offload()? Thank you.
