Problems and Proposed Solutions

3. Merged Models?

Some merged models, such as AnythingV3, perform well. Should we then keep merging? Repeated merging is not scientifically sound: it ultimately produces a model that is overfitted in some directions and normal in others. Such a model looks good at first, but it turns out not to be faithful to the input prompt and suffers from the language drift problem mentioned above.

Solution

We can apply the method described above and train the two models together with Dreambooth, using a word frequency list. According to a calculated ratio, we add images generated by the model we want to merge to the training data, or substitute them for part of it, and we keep the training dataset dynamic throughout training to prevent overfitting, as described above. The result is a balanced model that does not overfit in any particular direction. As the final version, choose a checkpoint that is about to overfit but has not yet done so. Models of this type, such as CertainThing, are popular in the community because they produce good output even from poorly written prompts.
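As a rough sketch of the dynamic-dataset step, the snippet below rebuilds the training set once per epoch, replacing a fixed fraction of the real images with samples drawn from the model being folded in. This is an illustration only: `generate_fn`, `train_one_epoch`, `evaluate`, and `save_checkpoint` are hypothetical placeholders for a real Dreambooth trainer and validation metric, and `mix_ratio` stands in for the "calculated ratio" from the text.

```python
import random

def refresh_mixed_dataset(base_images, mix_ratio, generate_fn):
    """Rebuild the training set for one epoch.

    base_images: the original training images (paths or tensors).
    mix_ratio:   fraction of the set to replace with images sampled
                 from the model being merged in.
    generate_fn: hypothetical helper returning one image sampled from
                 the model we want to merge.
    """
    n_generated = int(len(base_images) * mix_ratio)
    # Resample this slice every epoch so the network never trains on the
    # same synthetic images twice -- the "dynamic dataset" that guards
    # against overfitting to the merged model's quirks.
    generated = [generate_fn() for _ in range(n_generated)]
    kept = random.sample(base_images, len(base_images) - n_generated)
    dataset = kept + generated
    random.shuffle(dataset)
    return dataset

# Sketch of the surrounding loop (placeholders, not a real API):
#
# for epoch in range(num_epochs):
#     dataset = refresh_mixed_dataset(base_images, mix_ratio=0.3,
#                                     generate_fn=sample_from_merge_target)
#     train_one_epoch(model, dataset)
#     save_checkpoint(model, epoch, evaluate(model, held_out_prompts))
#
# Afterwards, pick the last checkpoint whose validation metric was still
# improving -- "about to overfit, but not yet".
```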