stable-diffusion-webui-gfx803.git (master)
stable-diffusion-webui by AUTOMATIC1111 with patches for gfx803 GPU and Dockerfile
path: root/modules/sd_hijack_optimizations.py
Age         Author          Commit message
2023-01-06  brkirch         Change sub-quad chunk threshold to use percentage
2023-01-06  brkirch         Add Birch-san's sub-quadratic attention implementation
2022-12-20  brkirch         Use other MPS optimization for large q.shape[0] * q.shape[1]
2022-12-10  AUTOMATIC       cleanup some unneeded imports for hijack files
2022-12-10  AUTOMATIC       do not replace entire unet for the resolution hack
2022-11-23  Billy Cao       Patch UNet Forward to support resolutions that are not multiples of 64
2022-10-19  Cheka           Remove wrong self reference in CUDA support for invokeai
2022-10-18  C43H66N12O12S2  Update sd_hijack_optimizations.py
2022-10-18  C43H66N12O12S2  readd xformers attnblock
2022-10-18  C43H66N12O12S2  delete xformers attnblock
2022-10-11  brkirch         Use apply_hypernetwork function
2022-10-11  brkirch         Add InvokeAI and lstein to credits, add back CUDA support
2022-10-11  brkirch         Add check for psutil
2022-10-11  brkirch         Add cross-attention optimization from InvokeAI
2022-10-11  AUTOMATIC       rename hypernetwork dir to hypernetworks to prevent clash with an old filenam...
2022-10-11  AUTOMATIC       fixes related to merge
2022-10-11  AUTOMATIC       replace duplicate code with a function
2022-10-10  C43H66N12O12S2  remove functorch
2022-10-09  Fampai          Fix VRAM Issue by only loading in hypernetwork when selected in settings
2022-10-08  AUTOMATIC       make --force-enable-xformers work without needing --xformers
2022-10-08  AUTOMATIC       add fallback for xformers_attnblock_forward
2022-10-08  AUTOMATIC       simplify xfrmers options: --xformers to enable and that's it
2022-10-08  AUTOMATIC       emergency fix for xformers (continue + shared)
2022-10-08  AUTOMATIC1111   Merge pull request #1851 from C43H66N12O12S2/flash
2022-10-08  C43H66N12O12S2  update sd_hijack_opt to respect new env variables
2022-10-08  C43H66N12O12S2  Update sd_hijack_optimizations.py
2022-10-08  C43H66N12O12S2  add xformers attnblock and hypernetwork support
2022-10-08  brkirch         Add hypernetwork support to split cross attention v1
2022-10-08  C43H66N12O12S2  switch to the proper way of calling xformers
2022-10-07  AUTOMATIC       added support for hypernetworks (???)
2022-10-07  C43H66N12O12S2  add xformers attention
2022-10-02  Jairo Correa    Merge branch 'master' into stable
2022-10-02  AUTOMATIC       initial support for training textual inversion