Commit Graph

4982 Commits

Author SHA1 Message Date
Martín (Netux) Rodríguez
dd268c48c9 feat(extensions): add toggle all checkbox to Installed tab
Small QoL addition.

While there is an option to disable all extensions with the radio buttons at the top, that only sets an extra flag and doesn't actually change the checked state of the extensions in the UI.

A use case for this checkbox is disabling all extensions except for a few, which is important when debugging extensions.
You could do that before, but you'd have to uncheck and recheck every extension one by one.
2023-06-25 00:48:46 -03:00
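
A minimal Gradio sketch of the idea behind this toggle-all checkbox (component names and layout are illustrative assumptions, not the actual Installed tab code): one master checkbox that checks or unchecks every extension at once.

```python
# Hypothetical sketch of a "toggle all" control; not the webui's actual extensions UI code.
import gradio as gr

EXTENSIONS = ["extension-a", "extension-b", "extension-c"]  # placeholder names

def toggle_all(enable_all):
    # Checking the master box selects every extension; unchecking it clears the selection.
    return EXTENSIONS if enable_all else []

with gr.Blocks() as demo:
    master = gr.Checkbox(label="Toggle all", value=True)
    enabled = gr.CheckboxGroup(choices=EXTENSIONS, value=EXTENSIONS, label="Installed extensions")
    master.change(toggle_all, inputs=master, outputs=enabled)

if __name__ == "__main__":
    demo.launch()
```
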
AUTOMATIC
59419bd64a add changelog for 1.4.0 2023-06-09 22:47:58 +03:00
AUTOMATIC
cfdd1b9418 linter 2023-06-09 22:47:27 +03:00
AUTOMATIC1111
89e6c60546
Merge pull request #11092 from AUTOMATIC1111/Generate-Forever-during-generation
Allow activation of Generate Forever during generation
2023-06-09 22:33:23 +03:00
AUTOMATIC1111
d00139eea8
Merge pull request #11087 from AUTOMATIC1111/persistent_conds_cache
persistent conds cache
2023-06-09 22:32:49 +03:00
AUTOMATIC1111
b8d7506ebe
Merge pull request #11123 from akx/dont-die-on-bad-symlink-lora
Don't die when a LoRA is a broken symlink
2023-06-09 22:31:49 +03:00
AUTOMATIC1111
f9606b8826
Merge pull request #10295 from Splendide-Imaginarius/mk2-blur-mask
Split mask blur into X and Y components, patch Outpainting MK2 accordingly
2023-06-09 22:31:29 +03:00
AUTOMATIC1111
741bd71873
Merge pull request #11048 from DGdev91/force_python1_navi_renoir
Forcing Torch Version to 1.13.1 for RX 5000 series GPUs
2023-06-09 22:30:54 +03:00
Aarni Koskela
d75ed52bfc Don't die when a LoRA is a broken symlink
Fixes #11098
2023-06-09 13:26:36 +03:00
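
A minimal sketch of the idea in the commit above, assuming a plain directory scan rather than the webui's actual LoRA listing code: detect broken symlinks and skip them instead of raising.

```python
import os

def list_lora_files(lora_dir):
    """Return LoRA-like files, skipping broken symlinks instead of crashing on them."""
    found = []
    for name in os.listdir(lora_dir):
        path = os.path.join(lora_dir, name)
        # os.path.exists() follows symlinks, so it is False for a symlink whose target is gone.
        if os.path.islink(path) and not os.path.exists(path):
            print(f"Skipping broken symlink: {path}")
            continue
        if name.endswith((".safetensors", ".pt", ".ckpt")):
            found.append(path)
    return found
```
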
Splendide Imaginarius
72815c0211 Split Outpainting MK2 mask blur into X and Y components
Fixes unexpected noise in non-outpainted borders when using the MK2 script.
2023-06-09 08:37:26 +00:00
Splendide Imaginarius
1503af60b0 Split mask blur into X and Y components
Prerequisite to fixing the Outpainting MK2 mask blur bug.
2023-06-09 08:36:33 +00:00
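
A sketch of the X/Y split described in these two commits, assuming a NumPy mask and SciPy rather than the webui's own blur helpers: apply a Gaussian blur whose strength differs per axis instead of one shared radius.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_mask_xy(mask: np.ndarray, blur_x: float, blur_y: float) -> np.ndarray:
    # sigma is given per axis as (rows, columns), i.e. (Y, X).
    return gaussian_filter(mask.astype(np.float32), sigma=(blur_y, blur_x))
```

With a sigma of zero on an axis, no blur is applied along that axis, which is what makes the per-axis control useful for outpainting only one edge.
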
w-e-w
46e4777fd6 Generate Forever during generation
2023-06-08 17:56:03 +09:00
w-e-w
7f2214aa2b persistent conds cache
Update shared.py
2023-06-08 14:27:22 +09:00
AUTOMATIC1111
cf28aed1a7
Merge pull request #11058 from AUTOMATIC1111/api-wiki
link footer API to Wiki when API is not active
2023-06-07 07:49:59 +03:00
AUTOMATIC1111
806ea639e6
Merge pull request #11066 from aljungberg/patch-1
Fix upcast attention dtype error.
2023-06-07 07:48:52 +03:00
Alexander Ljungberg
d9cc0910c8
Fix upcast attention dtype error.
Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:

```
  File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
```

The fix is to make sure to upcast the value tensor too.
2023-06-06 21:45:30 +01:00
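
A minimal PyTorch sketch of the fix described above (function name and signature are illustrative, not the exact `sdp_attnblock_forward`): when upcasting attention to float32, cast q, k and v together so `scaled_dot_product_attention` sees a single dtype.

```python
import torch

def sdp_attention_upcast(q, k, v, upcast=True):
    out_dtype = q.dtype
    if upcast:
        # The bug: upcasting only q and k while v stays in half precision triggers
        # the "same dtype" RuntimeError shown above. Cast all three tensors.
        q, k, v = q.float(), k.float(), v.float()
    out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
    return out.to(out_dtype)
```
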
DGdev91
62860c221e Skip forcing python and pytorch versions if TORCH_COMMAND already set 2023-06-06 15:43:32 +02:00
w-e-w
96e446218c link footer API to Wiki when API is not active 2023-06-06 18:58:44 +09:00
DGdev91
8646768801 Write "RX 5000 Series" instead of "Navi" in error message 2023-06-06 10:03:20 +02:00
DGdev91
95d4d650d4 Check python version for Navi 1 only 2023-06-06 09:59:13 +02:00
DGdev91
e0d923bdf8 Force python1 for Navi1 only, use python_cmd for python 2023-06-06 09:55:49 +02:00
DGdev91
2788ce8c7b Fix error in webui.sh 2023-06-06 01:51:35 +02:00
DGdev91
8d98532b65 Forcing Torch Version to 1.13.1 for Navi and Renoir GPUs 2023-06-06 01:05:31 +02:00
AUTOMATIC1111
a009fe15fd
Merge pull request #11047 from AUTOMATIC1111/parse_generation_parameters_with_error
handle exceptions when parsing generation parameters from PNG info
2023-06-06 00:13:27 +03:00
w-e-w
851bf43520 print error and continue
2023-06-06 05:50:43 +09:00
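
A hedged sketch of the "print error and continue" behaviour: wrap the parameter parsing in try/except so one malformed PNG info block doesn't abort the whole operation (`parse_generation_parameters` is passed in here as a stand-in for the real parser).

```python
import traceback

def safe_parse(parse_generation_parameters, raw_text):
    try:
        return parse_generation_parameters(raw_text)
    except Exception:
        print(f"Error parsing generation parameters:\n{raw_text}")
        print(traceback.format_exc())
        return {}  # continue with empty parameters instead of raising
```
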
AUTOMATIC1111
0895c2369c
Merge pull request #11037 from AUTOMATIC1111/restart-autolaunch
fix rework-disable-autolaunch for new restart method
2023-06-05 20:57:31 +03:00
w-e-w
c2808f3040 SD_WEBUI_RESTARTING 2023-06-06 02:52:05 +09:00
w-e-w
eaace155ce restore old disable --autolaunch 2023-06-06 02:47:18 +09:00
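
A minimal sketch of how the two autolaunch commits above fit together, assuming the restart path exports SD_WEBUI_RESTARTING (the exact value checked is an assumption): auto-open the browser only on a fresh launch, not when the UI is restarting in place.

```python
import os
import webbrowser

def maybe_autolaunch(url: str, autolaunch_enabled: bool) -> None:
    # Assumed convention: the restart code sets SD_WEBUI_RESTARTING before reloading.
    restarting = os.environ.get("SD_WEBUI_RESTARTING") is not None
    if autolaunch_enabled and not restarting:
        webbrowser.open(url)
```
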
AUTOMATIC1111
e89a248e2e
Merge pull request #11031 from akx/zoom-and-pan-namespace
Zoom and pan: namespace & simplify
2023-06-05 20:40:31 +03:00
AUTOMATIC1111
1dd8d571a4
Merge pull request #11043 from akx/restart-envvar
Restart: only do restart if running via the wrapper script
2023-06-05 20:06:40 +03:00
Aarni Koskela
46a5bd64ed Restart: only do restart if running via the wrapper script 2023-06-05 20:04:28 +03:00
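
A hedged sketch of the guard in the commit above: only attempt a true restart when an environment variable set by the wrapper script is present (the variable name SD_WEBUI_RESTART is an assumption for illustration); otherwise fall back to a plain exit.

```python
import os
import sys

def restart_or_exit():
    # Assumed convention: the wrapper script exports SD_WEBUI_RESTART before launching.
    if os.environ.get("SD_WEBUI_RESTART"):
        # The wrapper script sees the process exit and relaunches it.
        sys.exit(0)
    print("Not started via the wrapper script; exiting instead of restarting.")
    sys.exit(1)
```
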
w-e-w
1411a6e74b rework-disable-autolaunch 2023-06-06 01:09:30 +09:00
AUTOMATIC
18acc0b30d revert the message to how it was 2023-06-05 11:08:57 +03:00
AUTOMATIC1111
7a7a201d81
Merge pull request #10956 from akx/len
Simplify a bunch of `len(x) > 0`/`len(x) == 0` style expressions
2023-06-05 11:06:37 +03:00
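
The cleanup in #10956 is the usual Python truthiness idiom; for illustration:

```python
items = ["a", "b"]

if len(items) > 0:   # before
    print("non-empty")
if items:            # after: empty sequences are falsy
    print("non-empty")

if len(items) == 0:  # before
    print("empty")
if not items:        # after
    print("empty")
```
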
Aarni Koskela
2d4c66f7b5 Zoom and Pan: simplify waitForOpts 2023-06-05 10:40:42 +03:00
Aarni Koskela
6163b38ad9 Zoom and Pan: use for instead of forEach 2023-06-05 10:37:00 +03:00
Aarni Koskela
afbb0b5f86 Zoom and Pan: simplify getElements (it's not actually async) 2023-06-05 10:37:00 +03:00
Aarni Koskela
68cda4f213 Zoom and Pan: use elementIDs from closure scope 2023-06-05 10:37:00 +03:00
Aarni Koskela
8fd20bd4c3 Zoom and Pan: move helpers into its namespace to avoid littering global scope 2023-06-05 10:36:55 +03:00
AUTOMATIC
9781f31f74 Merge branch 'master' into dev 2023-06-05 06:16:03 +03:00
AUTOMATIC
baf6946e06 Merge branch 'release_candidate' 2023-06-05 06:13:41 +03:00
AUTOMATIC1111
1e7e34337f
Merge pull request #11013 from ramyma/get_latent_upscale_modes_api
Get latent upscale modes API endpoint
2023-06-04 18:20:36 +03:00
ramyma
4faaf3e723 Add endpoint to get latent_upscale_modes for hires fix 2023-06-04 17:05:29 +03:00
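
A hedged FastAPI sketch of such an endpoint (the route path, response shape, and mode names are assumptions rather than the webui's exact API): expose the latent upscale mode names so API clients can populate a hires-fix dropdown.

```python
from fastapi import FastAPI

app = FastAPI()

LATENT_UPSCALE_MODES = ["Latent", "Latent (antialiased)", "Latent (bicubic)"]  # illustrative subset

@app.get("/sdapi/v1/latent-upscale-modes")
def get_latent_upscale_modes():
    return [{"name": name} for name in LATENT_UPSCALE_MODES]
```
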
AUTOMATIC
fbf88343de prevent calculating conds for second pass of hires fix when they are the same as for the first pass 2023-06-04 16:29:02 +03:00
AUTOMATIC
1ca5e76f7b fix for conds of second hires fix pass being calculated using first pass's networks, and add an option to revert to old behavior 2023-06-04 13:07:31 +03:00
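
A minimal sketch of the caching idea behind these conds commits (the cache key contents are an assumption): recompute the conditioning only when the inputs that affect it change, so an identical second hires-fix pass reuses the first pass's result.

```python
_conds_cache = {"key": None, "value": None}

def get_conds(compute_conds, prompts, steps, extra_networks):
    # Key on everything that influences the conditioning, including active extra networks.
    key = (tuple(prompts), steps, tuple(extra_networks))
    if _conds_cache["key"] != key:
        _conds_cache["key"] = key
        _conds_cache["value"] = compute_conds(prompts, steps)
    return _conds_cache["value"]
```
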
AUTOMATIC1111
1c6dca9383
Merge pull request #10997 from AUTOMATIC1111/fix-conds-caching-with-extra-network
fix conds caching with extra network
2023-06-04 12:07:41 +03:00
AUTOMATIC1111
56bf522913
Merge pull request #10990 from vkage/sd_hijack_optimizations_bugfix
torch.cuda.is_available() check for SdOptimizationXformers
2023-06-04 11:34:32 +03:00
AUTOMATIC
2e23c9c568 fix the broken line for #10990 2023-06-04 11:33:51 +03:00
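
A hedged sketch of the check referenced in #10990 (the class shape is illustrative, not the actual SdOptimizationXformers definition): report the xformers optimization as usable only when both the package and CUDA are available.

```python
import torch

class XformersOptimizationSketch:
    def is_available(self) -> bool:
        try:
            import xformers  # noqa: F401
        except ImportError:
            return False
        return torch.cuda.is_available()
```
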
AUTOMATIC1111
0819383de0
Merge pull request #10975 from AUTOMATIC1111/restart3
Yet another method to restart webui.
2023-06-04 11:17:20 +03:00
AUTOMATIC1111
efc4c79b5e
Merge pull request #10980 from AUTOMATIC1111/sysinfo
Added sysinfo tab to settings
2023-06-04 11:16:32 +03:00