Age | Commit message | Author
Added DeepDanbooru interrogator
support EMA weights in processing (????)
Otherwise, we may override it with one of the next two paths (. or ..) if it is present there, and then the local paths of other modules (taming transformers, codeformers, etc.) won't be found in sd_path/../.
Fix https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/1085
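A minimal sketch of the search order this fix implies, with sd_path checked before . and .. (the candidate list and sentinel file below are illustrative assumptions, not the repository's exact code):

```python
import os

script_path = os.path.dirname(os.path.abspath(__file__))

# Candidates in priority order: sd_path first, then "." and "..".
# If "." or ".." were checked first and happened to contain a matching tree,
# it would win, and the sibling repos (taming transformers, codeformers, etc.)
# would no longer be found relative to sd_path/../.
possible_sd_paths = [
    os.path.join(script_path, "repositories/stable-diffusion"),  # assumed sd_path
    ".",
    os.path.dirname(script_path),
]

sd_path = None
for candidate in possible_sd_paths:
    # Sentinel file identifying a stable-diffusion checkout.
    if os.path.exists(os.path.join(candidate, "ldm/models/diffusion/ddpm.py")):
        sd_path = os.path.abspath(candidate)
        break
```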
I had to check the code to work out what splitting was 🤷🏿
regarding cross attention optimizations
xformers attention
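The title alone doesn't show the change; as a rough illustration, xformers attention means replacing the explicit softmax(QKᵀ/√d)V computation with xformers' memory-efficient kernel (the shapes below are made up for the example):

```python
import torch
import xformers.ops

# (batch, seq_len, heads, head_dim) — the layout memory_efficient_attention expects.
q = torch.randn(2, 4096, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 4096, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 4096, 8, 64, device="cuda", dtype=torch.float16)

# Equivalent to softmax(q @ k^T / sqrt(64)) @ v per head, but computed
# blockwise so the full seq_len x seq_len attention matrix is never materialized.
out = xformers.ops.memory_efficient_attention(q, k, v)
```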
before the final AND will be dropped
* Add hypernetwork support to split_cross_attention_forward_v1
* Fix device check in esrgan_model.py to use devices.device_esrgan instead of shared.device
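A minimal sketch of what the first bullet describes, assuming webui's convention that a hypernetwork carries a (k, v) pair of layers per context width, applied to the context before the key/value projections (hypernetwork.layers and its keying are assumptions here, not the exact code):

```python
import torch
import torch.nn as nn

def project_kv(to_k: nn.Linear, to_v: nn.Linear, context: torch.Tensor, hypernetwork=None):
    """Project context to keys/values, routing through hypernetwork layers if present."""
    # Assumed convention: hypernetwork.layers is keyed by the context's channel
    # count and stores a (k_layer, v_layer) pair of modules.
    layers = hypernetwork.layers.get(context.shape[2]) if hypernetwork is not None else None
    if layers is not None:
        # Transform the context with the hypernetwork before projecting.
        k = to_k(layers[0](context))
        v = to_v(layers[1](context))
    else:
        k = to_k(context)
        v = to_v(context)
    return k, v
```

The second bullet is simpler: the ESRGAN model should be moved to devices.device_esrgan rather than the global shared.device, so the upscaler can run on a different device than the SD model.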