# Custom Diffusion WebUI

An unofficial extension that implements Custom Diffusion for Automatic1111's WebUI.

## What is Custom Diffusion

Custom Diffusion is, in short, lightweight finetuning combined with Textual Inversion (TI). Instead of tuning the whole model, only the K and V projection matrices of the cross-attention blocks are tuned, simultaneously with one or more token embeddings. It has speed and memory requirements similar to TI and is claimed to give better results in fewer steps.
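
For readers unfamiliar with the idea, here is a minimal PyTorch sketch (not this extension's actual code) of what "tune only the cross-attention K/V projections plus a token embedding" means. The `attn2`/`to_k`/`to_v` names follow the usual Stable Diffusion naming convention and are an assumption here.

```python
import torch

def collect_custom_diffusion_params(unet: torch.nn.Module) -> list[torch.nn.Parameter]:
    """Collect only the key/value projection weights of cross-attention blocks."""
    params = []
    for name, module in unet.named_modules():
        # In Stable Diffusion UNets, cross-attention layers are usually named
        # "attn2" and their key/value projections "to_k"/"to_v" (assumed names).
        if "attn2" in name and (name.endswith("to_k") or name.endswith("to_v")):
            params.extend(module.parameters())
    return params

# Hypothetical training setup: freeze the UNet, then optimize the K/V weights
# together with a new token embedding, as in Textual Inversion.
# unet.requires_grad_(False)
# token_embedding = torch.nn.Parameter(torch.randn(1, 768) * 0.01)
# trainable = collect_custom_diffusion_params(unet) + [token_embedding]
# for p in trainable:
#     p.requires_grad_(True)
# optimizer = torch.optim.AdamW(trainable, lr=1e-5)
```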

## How to use this

### Training

You can find the UI in the Train/Train Custom Diffusion tab. Train just as you would a normal TI embedding. In the training log directory, alongside `name-steps.pt` you should also see `name-steps.delta.safetensors`, which contains the finetuned delta weights (roughly 50 MB at half precision, uncompressed).
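
If you want to inspect what a delta file contains, a sketch like the following works with the `safetensors` library (the filename is just the pattern described above):

```python
from safetensors.torch import load_file

# Load a delta file from the training log directory (filename is illustrative).
deltas = load_file("name-steps.delta.safetensors")

total_bytes = 0
for key, tensor in deltas.items():
    total_bytes += tensor.numel() * tensor.element_size()
    print(key, tuple(tensor.shape), tensor.dtype)
print(f"~{total_bytes / 1e6:.1f} MB of delta weights")
```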

### Regularization images

The original Custom Diffusion method includes regularization. To generate regularization images, go to the Custom Diffusion Utils/Make regularization images tab. You can then optionally supply the directory of generated images as regularization data when training.

### Using trained weights

The trained deltas are saved under `models/deltas` (configurable with `--deltas-dir`); you can also copy over the `.safetensors` files from the training log directory. The delta weights can be used in txt2img/img2img as an Extra Network: select them in the extra networks tab, just like hypernetworks. Use the token embedding like a normal TI embedding.
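
For the curious, the following hypothetical sketch shows the general idea of applying a delta file to a checkpoint outside the WebUI. It assumes the file stores replacement K/V weights keyed by parameter name, which may not match this extension's exact format; inside the WebUI the Extra Network hook handles this for you.

```python
import torch
from safetensors.torch import load_file

def apply_delta_weights(model: torch.nn.Module, delta_path: str) -> None:
    """Overwrite matching parameters in `model` with tensors from the delta file."""
    deltas = load_file(delta_path)  # e.g. a file under models/deltas
    state = model.state_dict()
    for key, tensor in deltas.items():
        if key in state:
            state[key] = tensor.to(dtype=state[key].dtype, device=state[key].device)
    model.load_state_dict(state)
```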

## Disclaimer

This is an unofficial implementation based on the paper; features and implementation details may differ from the original.

## Todo (roughly ordered by priority)

- UI/UX
- More testing and demos
- Separate learning rates for embedding and model weights
- Blending (simple linear combination of deltas)
- Merging (optimization-based, from the paper)
- Compression
- Let users choose which weights to finetune
- Regularization
- Multi-concept training