FLUX LoRA multi-node finetuning? #1220
Closed
AbdullahMu started this conversation in General
Replies: 1 comment
- Check the DISTRIBUTED.md document for a step-by-step guide. If it fails beyond that, it seems to be a local issue and probably out of scope, so I will convert this to a discussion instead.
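For orientation, a multi-node `accelerate` launch generally runs the same command on every node and varies only the machine rank. The sketch below assumes a hypothetical 2-node cluster with 8 GPUs per node; the script name `train.py`, the address `10.0.0.1`, and the port are placeholders, not values taken from DISTRIBUTED.md.

```bash
# Illustrative multi-node launch (not the exact commands from DISTRIBUTED.md).
# Run the same command on each node, changing only --machine_rank
# (0 on the main node, 1 on the second node, and so on).
# train.py stands in for whatever training entry point the repo uses.
accelerate launch \
  --num_machines 2 \
  --num_processes 16 \
  --machine_rank 0 \
  --main_process_ip 10.0.0.1 \
  --main_process_port 29500 \
  train.py
```

Here `--num_processes` is the GPU count summed across all nodes, and `--main_process_ip` must be an address of the rank-0 node that every other node can reach.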
- Is it possible to do multi-node LoRA finetuning with accelerate? I believe I have made the accelerate config file correctly, and FLUX has been loaded onto all the GPUs on all nodes, but after that I get an error like this: is there a way to fix this?
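For comparison, a multi-node config written by `accelerate config` usually looks roughly like the sketch below. All values are placeholders for a hypothetical 2-node, 8-GPU-per-node cluster, not the configuration used in this discussion.

```bash
# Sketch of a multi-node accelerate config, written to the default location
# that accelerate reads from. All values are placeholders for a hypothetical
# 2-node cluster with 8 GPUs per node; machine_rank must differ per node.
mkdir -p ~/.cache/huggingface/accelerate
cat > ~/.cache/huggingface/accelerate/default_config.yaml <<'EOF'
compute_environment: LOCAL_MACHINE
distributed_type: MULTI_GPU
machine_rank: 0                # 0 on the main node, 1 on the other node
main_process_ip: 10.0.0.1      # address of the main (rank-0) node
main_process_port: 29500
main_training_function: main
mixed_precision: bf16
num_machines: 2
num_processes: 16              # total GPUs across both nodes
rdzv_backend: static
same_network: true
EOF
```

If the config matches on every node except `machine_rank`, the usual remaining requirements are that `main_process_port` is reachable between the nodes and that `num_processes` equals the total GPU count across the cluster.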