Commit Graph

7758 Commits

Author SHA1 Message Date
Yuki Shindo 847d451505 Fix command line arguments format in webui-user.bat (#135)
* fix command line arguments format in webui-user.bat

* fix embeddings dir path
2024-02-08 19:47:42 -05:00
lllyasviel f06ba8e60b Significantly reduce thread abuse for faster model moving
This moves all major gradio calls into the main thread rather than random gradio threads.
This ensures that all torch.module.to() calls are performed in the main thread, to avoid GPU memory fragmentation as completely as possible.
In my tests, model moving is now 0.7 ~ 1.2 seconds faster, which means all 6GB/8GB VRAM users will get images 0.7 ~ 1.2 seconds faster on SDXL.
2024-02-08 10:13:59 -08:00
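The commit above describes funneling device transfers through a single thread so they never interleave. A minimal sketch of that pattern, with hypothetical helper names (this is not the actual Forge implementation; `move_model` stands in for a `torch` module's `.to(device)` call):

```python
import queue
import threading

# Jobs submitted from worker (e.g. gradio) threads; executed by one thread only.
main_thread_tasks = queue.Queue()

def run_on_main_thread(fn, *args):
    """Submit fn to the dedicated thread and block until it has run."""
    done = threading.Event()
    result = {}
    def job():
        result["value"] = fn(*args)
        done.set()
    main_thread_tasks.put(job)
    done.wait()
    return result["value"]

def main_thread_loop(stop):
    # In a real app this loop would live on the actual main thread.
    while not stop.is_set() or not main_thread_tasks.empty():
        try:
            job = main_thread_tasks.get(timeout=0.05)
        except queue.Empty:
            continue
        job()

executed_in = []

def move_model(name):
    # Stand-in for torch_module.to(device); records which thread ran it.
    executed_in.append(threading.current_thread().name)
    return f"{name} moved"

stop = threading.Event()
main = threading.Thread(target=main_thread_loop, args=(stop,), name="main-thread")
main.start()

# Two "gradio" worker threads both route their model moves through one thread.
results = []
workers = [
    threading.Thread(target=lambda n=n: results.append(run_on_main_thread(move_model, n)))
    for n in ("unet", "vae")
]
for w in workers:
    w.start()
for w in workers:
    w.join()
stop.set()
main.join()

print(sorted(results))   # ['unet moved', 'vae moved']
print(set(executed_in))  # {'main-thread'}
```

Because every transfer runs on the same thread, allocations happen in a single, predictable order, which is what reduces GPU memory fragmentation.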
lllyasviel 291ec743b6 use better context manager to fix potential problems 2024-02-08 02:00:54 -08:00
lllyasviel 760f727eb9 use better context manager to fix potential problems 2024-02-08 01:51:18 -08:00
lllyasviel 4c9db26541 support controlnet_model_function_wrapper for t2i-adapter 2024-02-07 21:42:56 -08:00
lllyasviel 50035ad414 fix outpaint with inpaint_global_harmonious 2024-02-07 20:31:22 -08:00
lllyasviel 49ec325f6a lin 2024-02-07 20:14:02 -08:00
lllyasviel a1670c536d Allow controlnet_model_function_wrapper
for animatediff to manage controlnet batch slicing window
2024-02-07 20:06:23 -08:00
Chenlei Hu 383aaca1eb Improve model filtering (#114) 2024-02-07 22:53:41 -05:00
lllyasviel 42dd258c8d revise attention alignment in stylealign #100
A mistake in the day-0 release is that the attention layers of cond and uncond items in a batch were aligned when they should not be.
After aligning the batch for cond and uncond separately, they now work and give the same results as legacy sd-webui-cnet.
2024-02-07 19:11:53 -08:00
lllyasviel f63917a323 add codes discussed in #73 that may help ipadapters 2024-02-07 18:57:57 -08:00
lllyasviel ef781cabcb Backend: Allow control signal to be none for advanced weighting 2024-02-07 13:02:42 -08:00
lllyasviel c3a66b016b try solve dtype cast for #112 2024-02-07 12:39:55 -08:00
Chenlei Hu e1faf8327b Add back ControlNet HR option (#90)
* Add back ControlNet HR option

* nits

* protect kernel

* add enum

* fix

* fix

* Update controlnet.py

* restore controlnet.py

* hint ui

* Update controlnet.py

* fix

* Update controlnet.py

* Backend: better controlnet mask batch broadcasting

* Update README.md

* fix inpaint batch dim align #94

* fix sigmas device in rare cases #71

* rework sigma device mapping

* Add hr_option to infotext

---------

Co-authored-by: lllyasviel <lyuminzhang@outlook.com>
2024-02-07 11:09:52 -05:00
lllyasviel 257ac2653a rework sigma device mapping 2024-02-07 00:44:12 -08:00
lllyasviel d11c9d7506 fix sigmas device in rare cases #71 2024-02-06 23:12:35 -08:00
lllyasviel 4ea4a92fe9 fix inpaint batch dim align #94 2024-02-06 22:57:53 -08:00
lllyasviel 65f9c7d442 Update README.md 2024-02-06 21:46:54 -08:00
lllyasviel c185e39e59 Backend: better controlnet mask batch broadcasting 2024-02-06 20:13:09 -08:00
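The mask batch broadcasting commit above implies matching a list of ControlNet masks against an image batch of a different size. A minimal sketch of one plausible broadcasting rule, with a hypothetical helper name (not Forge's actual function):

```python
def broadcast_masks(masks, batch_size):
    """Broadcast a list of masks to batch_size entries (illustrative sketch):
    a single mask is repeated, a matching list is kept, and a shorter list
    is cycled to cover the whole batch."""
    if len(masks) == batch_size:
        return masks
    if len(masks) == 1:
        return masks * batch_size
    return [masks[i % len(masks)] for i in range(batch_size)]

print(broadcast_masks(["m1"], 3))       # ['m1', 'm1', 'm1']
print(broadcast_masks(["a", "b"], 4))   # ['a', 'b', 'a', 'b']
```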
Chenlei Hu 1110183943 Fix ControlNet UI preset (#87) 2024-02-06 21:52:04 -05:00
lllyasviel e62631350a Update forge_version.py 2024-02-06 17:55:14 -08:00
lllyasviel e579fab4d0 try solve #71 2024-02-06 17:46:23 -08:00
lllyasviel 6301a6660e solve #73 2024-02-06 17:28:00 -08:00
lllyasviel 711844ecd8 solve #76 2024-02-06 17:25:28 -08:00
lllyasviel 70ae2a4bce fix controlnet ignored in batch #55 2024-02-06 17:12:57 -08:00
lllyasviel fc5c70a28d gradio fix 2024-02-06 14:32:42 -08:00
Chengsong Zhang b58b0bd425 batch mask (#44)
* mask batch, not working

* mask batch, working, infotext broken

* try remove old codes

* set CUDA_VISIBLE_DEVICES with args

* Revert "try remove old codes"

This reverts commit 63c527c373.

* Update controlnet_ui_group.py

* readme

* 🐛 Fix infotext

---------

Co-authored-by: lllyasviel <lyuminzhang@outlook.com>
Co-authored-by: huchenlei <chenlei.hu@mail.utoronto.ca>
2024-02-06 12:54:35 -06:00
lllyasviel 5bea443d94 add a note after fixing repeated loading bug on 4090 2024-02-06 08:13:29 -08:00
Chenlei Hu 79e4e46061 🐛 Fix SVD tab (#63) 2024-02-06 11:10:47 -05:00
lllyasviel 402b7beb87 fix repeated model loading bug on 4090 2024-02-06 08:02:03 -08:00
lllyasviel a578da074b fix repeated model loading bug on 4090 2024-02-06 07:59:19 -08:00
lllyasviel d76b830add reduce prints 2024-02-06 07:56:15 -08:00
lllyasviel 65367aa24d accurate words 2024-02-06 07:51:37 -08:00
lllyasviel 4939cf18d8 update hints for 4090 2024-02-06 07:43:33 -08:00
lllyasviel 7359740f36 Revert "safer device"
This reverts commit 1204d490d9.
2024-02-06 05:27:07 -08:00
lllyasviel 1204d490d9 safer device 2024-02-06 05:01:58 -08:00
lllyasviel 9c31b0ddcb try fix #56 2024-02-06 04:51:08 -08:00
lllyasviel 6aee7a2032 Update forge_version.py 2024-02-05 23:55:27 -08:00
lllyasviel 74ff4a9ba9 set CUDA_VISIBLE_DEVICES with args 2024-02-05 23:53:11 -08:00
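Mapping a command-line argument onto CUDA_VISIBLE_DEVICES, as the commit above describes, could look like the sketch below (the `--device-id` flag name is an assumption; check the actual launcher). The environment variable must be set before any CUDA library initializes, so this has to run before importing torch:

```python
import argparse
import os

# Hypothetical flag name; a real launcher may spell this differently.
parser = argparse.ArgumentParser()
parser.add_argument("--device-id", type=str, default=None)
# parse_known_args ignores flags meant for other parts of the app;
# the explicit list here simulates a command line for demonstration.
args, _ = parser.parse_known_args(["--device-id", "1"])

if args.device_id is not None:
    # Restrict CUDA to the requested GPU before torch is imported.
    os.environ["CUDA_VISIBLE_DEVICES"] = args.device_id

print(os.environ.get("CUDA_VISIBLE_DEVICES"))  # 1
```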
lllyasviel 1ecbff15fa add note about token merging 2024-02-05 21:55:59 -08:00
lllyasviel 7fd499a034 use new links for images if they were broken 2024-02-05 21:41:37 -08:00
lllyasviel b8cd6d2e21 Update forge_version.py 2024-02-05 20:12:37 -08:00
lllyasviel 58dff34084 fix ddpm 2024-02-05 20:12:17 -08:00
Kohaku-Blueleaf 393c19bbcf Remove Lycoris back compat (#41)
Forge uses a totally different mechanism to handle lora/lycoris models.
I think it is good to remove the back-compat things so the "lyco" alias can be used by legacy lycoris extensions (some other extensions may rely on it).
2024-02-06 11:17:15 +08:00
lllyasviel b03df6fdb1 Update README.md 2024-02-05 17:59:37 -08:00
lllyasviel f4e6794dbd Update README.md 2024-02-05 17:13:47 -08:00
lllyasviel c5b51b35fb Update README.md 2024-02-05 15:50:44 -08:00
lllyasviel 218a10179b Update README.md 2024-02-05 15:48:57 -08:00
lllyasviel 4221ccf239 Update README.md 2024-02-05 15:45:03 -08:00
lllyasviel 53057f33ed Update forge_version.py 2024-02-05 15:30:45 -08:00