Flux for 8GB graphics card
Is it possible to run such a huge model directly, or do we have to wait for pruned versions?
Let's discuss a script to prune it properly.
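FLUX.1-schnell is a ~12B-parameter transformer, so pruning isn't the only way to shrink it: 4-bit quantization already gets it onto 8GB cards. A minimal sketch of the quantization route (not a pruning script), assuming diffusers ≥ 0.31 with bitsandbytes installed; the prompt and filenames are just examples:

```python
import torch
from diffusers import BitsAndBytesConfig, FluxPipeline, FluxTransformer2DModel

# Quantize the ~12B-parameter transformer to 4-bit NF4 so it fits in 8GB.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    subfolder="transformer",
    quantization_config=quant_config,
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # keep idle components (VAE, text encoders) in RAM

image = pipe(
    "a corgi wearing sunglasses on a beach",
    num_inference_steps=4,   # schnell is distilled for ~4 steps
    guidance_scale=0.0,      # schnell ignores classifier-free guidance
    max_sequence_length=256,
).images[0]
image.save("flux_nf4.png")
```

If even the NF4 weights don't fit, sequential CPU offload (see the sketch further down) is the fallback, at a big speed cost.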
Will this also work on RunDiffusion?
Works on my 8GB card, though a 4-step image took 2 minutes...
Hold my beer: works on my Dell Inspiron 15 Gaming with an NVIDIA GeForce 1050 mobile (4GB VRAM). The CPU matters here, because you offload work onto it to make the model fit in VRAM.
I ran this code: https://github.com/InServiceOfX/InServiceOfX/blob/master/PythonLibraries/HuggingFace/MoreDiffusers/morediffusers/Applications/terminal_only_finite_loop_flux.py
I made this: https://www.instagram.com/p/C-U61P2p0jG/?utm_source=ig_web_copy_link&igsh=MzRlODBiNWFlZA== https://x.com/inserviceofx/status/1820790765670252776
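For anyone who doesn't want to read the whole linked script: the trick that makes this fit on 4GB is presumably sequential CPU offload, which streams one submodule at a time to the GPU. A minimal sketch with plain diffusers (not the MoreDiffusers code above), assuming the official FLUX.1-schnell checkpoint:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
)
# Stream weights to the GPU one submodule at a time. Much slower than
# keeping the model resident, but peak VRAM drops to a few GB, which is
# how a 12B model squeezes onto a 4GB mobile card.
pipe.enable_sequential_cpu_offload()

image = pipe(
    "a lighthouse in a storm, oil painting",
    num_inference_steps=4,
    guidance_scale=0.0,
    max_sequence_length=256,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
image.save("flux_schnell.png")
```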
I did this about 15 hours after FLUX.1-schnell was released, in lowvram mode. A 2-step image generated in about 60 seconds in ComfyUI.
Flux still can't do the hard things, like understanding idioms and purely abstract concepts such as spaceships.
The bathtub prompt is "Burning the midnight oil while trying not to throw the baby out with the bathwater."
Try this merged, quantized version of FLUX: https://civitai.com/models/629858
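If you download a merged single-file checkpoint like that, diffusers can usually load it straight into the pipeline. A hedged sketch, assuming the file is a plain bf16/fp8 safetensors transformer (the filename below is a placeholder, not the actual Civitai file; NF4/GGUF checkpoints need different loaders):

```python
import torch
from diffusers import FluxPipeline, FluxTransformer2DModel

# "merged-flux.safetensors" is a placeholder for the downloaded checkpoint,
# not the real Civitai filename.
transformer = FluxTransformer2DModel.from_single_file(
    "merged-flux.safetensors", torch_dtype=torch.bfloat16
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_sequential_cpu_offload()  # still needed on 8GB cards
```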