Age | Commit message | Author
Since norm layers need fp32, I only convert the linear-operation layers (Conv2d/Linear) to bf16. The text encoder (TE) also uses some PyTorch functions that do not support bf16 autocast on CPU, so I added a condition to indicate whether the autocast is for the unet.
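A rough illustration of the approach described above (not the repository's actual code; the helper names cast_linear_layers_to_bf16, maybe_autocast, and the for_unet flag are assumptions): cast only Conv2d/Linear modules to bf16 and gate CPU autocast on whether the unet is running.

    import contextlib
    import torch
    import torch.nn as nn

    def cast_linear_layers_to_bf16(model: nn.Module) -> nn.Module:
        # Cast only the linear-operation layers (Conv2d / Linear) to bfloat16;
        # norm layers stay in fp32 because they need the extra precision.
        for module in model.modules():
            if isinstance(module, (nn.Conv2d, nn.Linear)):
                module.to(torch.bfloat16)
        return model

    @contextlib.contextmanager
    def maybe_autocast(device_type: str, for_unet: bool):
        # On CPU, some ops used by the text encoder do not support bf16 autocast,
        # so bf16 autocast is only enabled when the forward pass is for the unet.
        if device_type == "cpu" and not for_unet:
            yield
        else:
            with torch.autocast(device_type=device_type, dtype=torch.bfloat16):
                yield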
old-style emphasis
Edit-attention fixes
Add lora-embedding bundle system
Ability for extensions to return custom data via the API in response.images
Fix the KeyError exception when processing override_settings keys
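A minimal sketch of the kind of guard such a fix usually involves (the function and variable names here are illustrative, not the project's actual code): skip override_settings keys that are not recognised instead of letting a KeyError propagate.

    def apply_override_settings(overrides: dict, known_settings: dict) -> dict:
        # Ignore unknown override_settings keys instead of raising KeyError.
        applied = {}
        for key, value in overrides.items():
            if key not in known_settings:
                continue
            applied[key] = value
        return applied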
Fix checkpoints_loaded ({checkpoint: state_dict}) cache: model.load_state_dict failed when the cached dict value was empty
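A hedged sketch of the guard implied by this fix (the cache and loader names are assumptions, not the repository's code): check that the cached state_dict is non-empty before passing it to model.load_state_dict, and fall back to reloading from disk otherwise.

    import torch

    def load_cached_weights(model, checkpoint_path: str, checkpoints_loaded: dict):
        # checkpoints_loaded caches {checkpoint: state_dict}; an empty cached
        # value would make model.load_state_dict fail, so reload in that case.
        state_dict = checkpoints_loaded.get(checkpoint_path)
        if not state_dict:
            state_dict = torch.load(checkpoint_path, map_location="cpu")
            checkpoints_loaded[checkpoint_path] = state_dict
        model.load_state_dict(state_dict)
        return model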
fix IndexError
Added an option to enable or disable the notification sound
Add altdiffusion-m18 support
Fix square brackets multiplier
in the extensions
Change denoising_strength default to None.
fix fieldname regex
edit-attention: Allow editing whitespace delimiters