Commit messages
Since norm layers need fp32, I only convert the linear-operation layers (conv2d/linear).
The TE also has some PyTorch functions that do not support bf16 amp on CPU, so I add a condition to indicate whether the autocast is for the unet.
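The casting rule described above can be sketched as a small precision-policy function. This is an illustrative sketch, not the repository's actual code: the function name, string-based layer/device labels, and the `is_unet` flag are all hypothetical stand-ins for the commit's logic.

```python
# Hypothetical precision policy mirroring the commit's description:
# only linear-operation layers (Conv2d/Linear) are cast to bf16, norm
# layers stay in fp32, and on CPU the text encoder (TE) avoids bf16
# because some PyTorch ops it uses lack CPU bf16 support, so bf16 is
# applied there only when the autocast is for the unet.

def pick_dtype(layer_type: str, device: str, is_unet: bool) -> str:
    """Return the compute dtype for one layer under bf16 autocast."""
    if layer_type in ("LayerNorm", "GroupNorm"):
        return "fp32"  # norm layers need fp32
    if layer_type in ("Conv2d", "Linear"):
        # On CPU, skip bf16 for the TE; only the unet uses it there.
        if device == "cpu" and not is_unet:
            return "fp32"
        return "bf16"
    return "fp32"  # everything else stays in full precision
```

For example, `pick_dtype("Linear", "cpu", False)` keeps a CPU text-encoder linear layer in fp32, while the same layer inside the unet gets bf16.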
general prompt.
old-style emphasis
Start / Restart generation by Ctrl (Alt) + Enter
LF
LF instead of CRLF
Exclude lambda
(((attention))) syntax
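The emphasis rule behind this syntax can be sketched briefly: each enclosing pair of parentheses multiplies a token's attention weight by 1.1. This is a minimal illustrative sketch only; the webui's real prompt parser also handles explicit weights like `(word:1.5)` and `[word]` de-emphasis, and the function name here is hypothetical.

```python
# Minimal sketch of the (((attention))) emphasis rule: every pair of
# enclosing parentheses multiplies the fragment's attention weight by
# 1.1, so (((word))) gets 1.1 ** 3. Explicit (word:1.5) weights and
# [word] de-emphasis are intentionally not covered here.

def emphasis_weight(fragment: str, base: float = 1.1) -> float:
    depth = 0
    while fragment.startswith("(") and fragment.endswith(")"):
        fragment = fragment[1:-1]  # peel one paren pair per loop
        depth += 1
    return round(base ** depth, 4)
```

So `emphasis_weight("(((attention)))")` yields 1.331, while a bare word keeps weight 1.0.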
Edit-attention fixes
Add ability to interrupt current generation and start generation again by Ctrl (Alt) + Enter
webui.settings.bat support
Add lora-embedding bundle system
Ability for extensions to return custom data via api in response.images