stable-diffusion-webui-gfx803.git (branch: master)
stable-diffusion-webui by AUTOMATIC1111 with patches for gfx803 GPU and Dockerfile
path: root/modules/textual_inversion/textual_inversion.py
Age        | Commit message                                                                   | Author
2023-01-07 | CLIP hijack rework                                                               | AUTOMATIC
2023-01-06 | rework saving training params to file #6372                                      | AUTOMATIC
2023-01-06 | Merge pull request #6372 from timntorres/save-ti-hypernet-settings-to-txt-rev... | AUTOMATIC1111
2023-01-06 | allow loading embeddings from subdirectories                                     | Faber
2023-01-05 | typo in TI                                                                       | Kuma
2023-01-05 | Include model in log file. Exclude directory.                                    | timntorres
2023-01-05 | Clean up ti, add same behavior to hypernetwork.                                  | timntorres
2023-01-05 | Add option to save ti settings to file.                                          | timntorres
2023-01-04 | Merge branch 'master' into gradient-clipping                                     | AUTOMATIC1111
2023-01-04 | use shared function from processing for creating dummy mask when training inp... | AUTOMATIC
2023-01-04 | fix the merge                                                                    | AUTOMATIC
2023-01-04 | Merge branch 'master' into inpaint_textual_inversion                             | AUTOMATIC1111
2023-01-04 | Merge pull request #6253 from Shondoit/ti-optim                                  | AUTOMATIC1111
2023-01-03 | add job info to modules                                                          | Vladimir Mandic
2023-01-03 | Save Optimizer next to TI embedding                                              | Shondoit
2023-01-02 | feat(api): return more data for embeddings                                       | Philpax
2023-01-02 | fix the issue with training on SD2.0                                             | AUTOMATIC
2022-12-31 | changed embedding accepted shape detection to use existing code and support t... | AUTOMATIC
2022-12-31 | validate textual inversion embeddings                                            | Vladimir Mandic
2022-12-24 | fix F541 f-string without any placeholders                                       | Yuval Aboulafia
2022-12-14 | Fix various typos                                                                | Jim Hays
2022-12-03 | Merge branch 'master' into racecond_fix                                          | AUTOMATIC1111
2022-11-30 | Use devices.autocast instead of torch.autocast                                   | brkirch
2022-11-27 | Merge remote-tracking branch 'flamelaw/master'                                   | AUTOMATIC
2022-11-27 | set TI AdamW default weight decay to 0                                           | flamelaw
2022-11-26 | Add support Stable Diffusion 2.0                                                 | AUTOMATIC
2022-11-23 | small fixes                                                                      | flamelaw
2022-11-21 | fix pin_memory with different latent sampling method                             | flamelaw
2022-11-20 | Gradient accumulation, autocast fix, new latent sampling method, etc             | flamelaw
2022-11-19 | change StableDiffusionProcessing to internally use sampler name instead of sa... | AUTOMATIC
2022-11-05 | Simplify grad clip                                                               | Muhammad Rizqi Nur
2022-11-04 | Fixes race condition in training when VAE is unloaded                            | Fampai
2022-11-02 | Merge branch 'master' into gradient-clipping                                     | Muhammad Rizqi Nur
2022-11-01 | fixed textual inversion training with inpainting models                          | Nerogar
2022-10-31 | Fixed minor bug                                                                  | Fampai
2022-10-31 | Merge branch 'master' of https://github.com/AUTOMATIC1111/stable-diffusion-we... | Fampai
2022-10-31 | Added TI training optimizations                                                  | Fampai
2022-10-30 | Merge master                                                                     | Muhammad Rizqi Nur
2022-10-30 | Fix dataset still being loaded even when training will be skipped                | Muhammad Rizqi Nur
2022-10-30 | Add missing info on hypernetwork/embedding model log                             | Muhammad Rizqi Nur
2022-10-30 | Revert "Add cleanup after training"                                              | Muhammad Rizqi Nur
2022-10-29 | Add cleanup after training                                                       | Muhammad Rizqi Nur
2022-10-29 | Add input validations before loading dataset for training                        | Muhammad Rizqi Nur
2022-10-29 | Merge branch 'master' into gradient-clipping                                     | Muhammad Rizqi Nur
2022-10-28 | Fix log off by 1                                                                 | Muhammad Rizqi Nur
2022-10-28 | Learning rate sched syntax support for grad clipping                             | Muhammad Rizqi Nur
2022-10-28 | Gradient clipping for textual embedding                                          | Muhammad Rizqi Nur
2022-10-26 | typo: cmd_opts.embedding_dir to cmd_opts.embeddings_dir                          | DepFA
2022-10-26 | Implement PR #3625 but for embeddings.                                           | timntorres
2022-10-26 | Implement PR #3309 but for embeddings.                                           | timntorres