Training GPU requirements and Guidelines?
#2 · opened by C0nsumption
I’m extremely interested in training LoRAs and supporting the model on CivitAI, but can I get more insight into the training requirements?
For LoRA training, can I get away with a 4090, or do I need to hit the cloud?
For full-parameter training, is a single A100 enough? Do I need multiple? How long does training take? Any insight would be deeply appreciated.
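For the full-parameter question, a rough back-of-the-envelope estimate (a generic rule of thumb, not a measured figure for this model): mixed-precision Adam needs roughly 16 bytes per parameter for weights, gradients, the fp32 master copy, and the two optimizer moments, before counting activations.

```python
# Rule-of-thumb VRAM math for full-parameter fine-tuning with Adam in
# mixed precision (fp16/bf16 weights + grads, fp32 master weights + m + v).
# Assumption: ~16 bytes/parameter, activations and checkpointing ignored.
def full_finetune_vram_gb(n_params: float, bytes_per_param: float = 16.0) -> float:
    return n_params * bytes_per_param / 1024**3

for billions in (1, 3, 7):
    gb = full_finetune_vram_gb(billions * 1e9)
    print(f"{billions}B params -> ~{gb:.0f} GB before activations")
# A 7B model already needs ~104 GB just for weights/grads/optimizer state,
# which is why a single 80 GB A100 is often not enough without sharding
# (FSDP/ZeRO), CPU offloading, or an 8-bit optimizer.
```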
I asked BlueLeaf (LyCORIS) about this, and he said you can train it on 8 GB. I was shocked, but he said he is working on adding support so we can train it via Kohya.
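For anyone wondering how 8 GB can be enough, here is a rough, illustrative sketch of the idea (not LyCORIS's or Kohya's actual code; names and shapes are made up): the base weights stay frozen, so they carry no gradients or optimizer state, and only two small low-rank matrices per adapted layer are trained.

```python
# Minimal LoRA-style adapter sketch: frozen base layer + trainable
# low-rank down/up projections. Illustrative only.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 8.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)   # frozen base weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.down = nn.Linear(base.in_features, rank, bias=False)  # d_in x r
        self.up = nn.Linear(rank, base.out_features, bias=False)   # r x d_out
        nn.init.zeros_(self.up.weight)            # adapter starts as a no-op
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.up(self.down(x))

# A 4096x4096 linear layer has ~16.8M frozen params, but at rank 8 only
# 2 * 4096 * 8 = 65,536 params actually get gradients and optimizer state.
layer = LoRALinear(nn.Linear(4096, 4096), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable params: {trainable}")
```

Combine that with gradient checkpointing and an 8-bit optimizer and the trainable footprint stays tiny, which is roughly why adapter training can squeeze onto an 8 GB card while full fine-tuning cannot.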