Jared Deckard
d3c86e5178
Note the Gradio user in the Exif data
2023-06-14 17:15:52 -05:00
Beinsezii
1d7c51fb9f
WEBUI.SH Navi 3 Support
...
Navi 3 cards now default to nightly torch to utilize ROCm 5.5
for out-of-the-box support.
https://download.pytorch.org/whl/nightly/
While it's not yet on the main pytorch "get started" site,
it still seems perfectly indexable via pip, which is all we need.
With this I'm able to clone a fresh repo and immediately run ./webui.sh
on my 7900 XTX without any problems.
2023-06-14 13:07:22 -07:00
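As a rough illustration of the approach described in this commit (a hedged sketch, not the actual webui.sh logic; the rocm5.5 index suffix and package list are assumptions), installing nightly torch from the index above via pip could look like this:
```python
# Hedged sketch: install a nightly ROCm torch build from the PyTorch nightly index.
import subprocess
import sys

# The rocm5.5 suffix is assumed here; webui.sh may pin a different index path.
index_url = "https://download.pytorch.org/whl/nightly/rocm5.5"
subprocess.check_call([
    sys.executable, "-m", "pip", "install", "--pre",
    "torch", "torchvision", "--index-url", index_url,
])
```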
w-e-w
376f793bde
git clone show progress
2023-06-15 04:23:52 +09:00
Jared Deckard
fa9d2ac2ff
Fix gradio special args in the call queue
2023-06-14 13:53:13 -05:00
w-e-w
6091c4e4aa
terminate -> stop
2023-06-14 19:53:08 +09:00
w-e-w
49fb2a3376
respond with 501 if not able to restart
2023-06-14 19:52:12 +09:00
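A minimal sketch of the behavior named in this commit, assuming a FastAPI route (the route name and the restartability flag below are hypothetical, not the repository's actual code):
```python
# Hedged sketch: reply with 501 when the server has no way to restart itself.
from fastapi import FastAPI, HTTPException

app = FastAPI()
can_restart = False  # hypothetical flag; the real condition lives in webui's launch state

@app.post("/sdapi/v1/server-restart")  # hypothetical route name
def restart_server():
    if not can_restart:
        raise HTTPException(status_code=501, detail="Restart is not available")
    return {}  # otherwise trigger the restart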
w-e-w
6387f0e85d
update workflow kill test server
2023-06-14 18:51:54 +09:00
w-e-w
5be6c026f5
rename routes
2023-06-14 18:51:47 +09:00
Danil Boldyrev
3a41d7c551
Formatting code with Prettier
2023-06-14 00:31:36 +03:00
Danil Boldyrev
9b687f013d
Reworked the disabling of functions, refactored part of the code
2023-06-14 00:24:25 +03:00
Aarni Koskela
d807164776
textual_inversion/logging.py: clean up duplicate key from sets (and sort them) (Ruff B033)
2023-06-13 13:07:39 +03:00
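For context, Ruff's B033 rule flags duplicate items in set literals; a hedged illustration (the names and values below are made up, not the actual textual_inversion/logging.py contents):
```python
# B033: the duplicate "steps" silently collapses, so the literal is misleading.
saved_params = {"steps", "cfg_scale", "steps"}

# cleaned up: duplicate removed and items sorted
saved_params = {"cfg_scale", "steps"}
```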
Aarni Koskela
8ce9b36e0f
Upgrade ruff to 272
2023-06-13 13:07:06 +03:00
Aarni Koskela
2667f47ffb
Remove stray space from SwinIR model URL
2023-06-13 13:00:05 +03:00
Aarni Koskela
bf67a5dcf4
Upscaler.load_model: don't return None, just use exceptions
2023-06-13 12:44:25 +03:00
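A minimal sketch of the pattern this commit describes (hypothetical function, not the actual Upscaler code): raise instead of returning None so callers fail loudly.
```python
import os

def load_model(path: str):
    # raising here beats returning None, which callers tend to ignore
    if not os.path.exists(path):
        raise FileNotFoundError(f"Upscaler model not found: {path}")
    ...  # load and return the model
```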
Aarni Koskela
e3a973a68d
Add TODO comments to sus model loads
2023-06-13 12:38:29 +03:00
Aarni Koskela
0afbc0c235
Fix up if "http" in ...: conditions
to be more sensible startswith checks
2023-06-13 12:38:29 +03:00
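A hedged before/after illustration of this change (example URL only, not the repository's code):
```python
url = "https://example.com/models/RealESRGAN_x4plus.pth"

# before: a substring test also matches strings that merely contain "http"
if "http" in url:
    pass

# after: an explicit scheme check
if url.startswith(("http://", "https://")):
    pass
```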
Aarni Koskela
89352a2f52
Move load_file_from_url
to modelloader
2023-06-13 12:38:28 +03:00
Aarni Koskela
165ab44f03
Use os.makedirs(..., exist_ok=True)
2023-06-13 12:35:43 +03:00
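A hedged before/after sketch of the pattern named above (the directory path is illustrative):
```python
import os

# before: check-then-create, racy and more verbose
if not os.path.exists("models/ESRGAN"):
    os.makedirs("models/ESRGAN")

# after: one call, no error if the directory already exists
os.makedirs("models/ESRGAN", exist_ok=True)
```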
Danil Boldyrev
9a2da597c5
remove console.log
2023-06-12 22:21:42 +03:00
Danil Boldyrev
ee029a8cad
Improved error output, improved settings menu
2023-06-12 22:19:22 +03:00
w-e-w
d80962681a
remove fastapi.Response
2023-06-12 18:21:01 +09:00
w-e-w
b9664ab615
move _stop route to api
2023-06-12 18:15:27 +09:00
Su Wei
7e2d39a2d1
update model checkpoint switch code
2023-06-12 15:22:49 +08:00
w-e-w
9142be0a0d
quit restart
2023-06-10 23:36:34 +09:00
arch-fan
5576a72322
fixed typos
2023-06-09 19:59:27 +00:00
AUTOMATIC
3b11f17a37
Merge branch 'dev' into release_candidate
2023-06-09 22:48:18 +03:00
AUTOMATIC
59419bd64a
add changelog for 1.4.0
2023-06-09 22:47:58 +03:00
AUTOMATIC
cfdd1b9418
linter
2023-06-09 22:47:27 +03:00
AUTOMATIC1111
89e6c60546
Merge pull request #11092 from AUTOMATIC1111/Generate-Forever-during-generation
...
Allow activation of Generate Forever during generation
2023-06-09 22:33:23 +03:00
AUTOMATIC1111
d00139eea8
Merge pull request #11087 from AUTOMATIC1111/persistent_conds_cache
...
persistent conds cache
2023-06-09 22:32:49 +03:00
AUTOMATIC1111
b8d7506ebe
Merge pull request #11123 from akx/dont-die-on-bad-symlink-lora
...
Don't die when a LoRA is a broken symlink
2023-06-09 22:31:49 +03:00
AUTOMATIC1111
f9606b8826
Merge pull request #10295 from Splendide-Imaginarius/mk2-blur-mask
...
Split mask blur into X and Y components, patch Outpainting MK2 accordingly
2023-06-09 22:31:29 +03:00
AUTOMATIC1111
741bd71873
Merge pull request #11048 from DGdev91/force_python1_navi_renoir
...
Forcing Torch Version to 1.13.1 for RX 5000 series GPUs
2023-06-09 22:30:54 +03:00
Aarni Koskela
d75ed52bfc
Don't die when a LoRA is a broken symlink
...
Fixes #11098
2023-06-09 13:26:36 +03:00
Splendide Imaginarius
72815c0211
Split Outpainting MK2 mask blur into X and Y components
...
Fixes unexpected noise in non-outpainted borders when using the MK2 script.
2023-06-09 08:37:26 +00:00
Splendide Imaginarius
1503af60b0
Split mask blur into X and Y components
...
Prerequisite to fixing the Outpainting MK2 mask blur bug.
2023-06-09 08:36:33 +00:00
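A minimal sketch of the idea, assuming an OpenCV-style blur (not the script's actual implementation; the per-axis values are hypothetical):
```python
import cv2
import numpy as np

mask = np.zeros((512, 512), dtype=np.float32)
blur_x, blur_y = 4.0, 16.0  # independent horizontal / vertical strengths

# ksize=(0, 0) lets OpenCV derive the kernel size from each sigma
blurred = cv2.GaussianBlur(mask, (0, 0), sigmaX=blur_x, sigmaY=blur_y)
```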
Su Wei
8ca34ad6d8
add model exists status check to modules/api/api.py, /sdapi/v1/options [POST]
2023-06-09 13:14:20 +08:00
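A hedged sketch of what such a check could look like for the options endpoint (the helper, registry, and error handling below are hypothetical, not the actual modules/api/api.py code):
```python
from fastapi import HTTPException

known_checkpoints = {"v1-5-pruned-emaonly.safetensors"}  # hypothetical registry

def set_options(payload: dict):
    checkpoint = payload.get("sd_model_checkpoint")
    if checkpoint is not None and checkpoint not in known_checkpoints:
        raise HTTPException(status_code=404, detail=f"Checkpoint {checkpoint} not found")
    ...  # apply the remaining options
```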
w-e-w
46e4777fd6
Generate Forever during generation
...
Generate Forever during generation
2023-06-08 17:56:03 +09:00
w-e-w
7f2214aa2b
persistent conds cache
...
Update shared.py
2023-06-08 14:27:22 +09:00
AUTOMATIC1111
cf28aed1a7
Merge pull request #11058 from AUTOMATIC1111/api-wiki
...
link footer API to Wiki when API is not active
2023-06-07 07:49:59 +03:00
AUTOMATIC1111
806ea639e6
Merge pull request #11066 from aljungberg/patch-1
...
Fix upcast attention dtype error.
2023-06-07 07:48:52 +03:00
Alexander Ljungberg
d9cc0910c8
Fix upcast attention dtype error.
...
Without this fix, enabling the "Upcast cross attention layer to float32" option while also using `--opt-sdp-attention` breaks generation with an error:
```
File "/ext3/automatic1111/stable-diffusion-webui/modules/sd_hijack_optimizations.py", line 612, in sdp_attnblock_forward
out = torch.nn.functional.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
RuntimeError: Expected query, key, and value to have the same dtype, but got query.dtype: float key.dtype: float and value.dtype: c10::Half instead.
```
The fix is to make sure to upcast the value tensor too.
2023-06-06 21:45:30 +01:00
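A minimal sketch of the fix described above (not the exact sd_hijack_optimizations.py code): cast all three tensors, including the value, to the upcast dtype before calling scaled_dot_product_attention.
```python
import torch

def sdp_attnblock_forward_sketch(q, k, v, upcast=True):
    if upcast:
        # value is included in the upcast, avoiding the dtype mismatch error
        q, k, v = q.float(), k.float(), v.float()
    return torch.nn.functional.scaled_dot_product_attention(
        q, k, v, dropout_p=0.0, is_causal=False
    )
```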
DGdev91
62860c221e
Skip forcing python and pytorch versions if TORCH_COMMAND already set
2023-06-06 15:43:32 +02:00
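A hedged sketch of the guard, written in Python for illustration (webui.sh does this in shell; the pinned versions below are assumptions based on the 1.13.1 pin mentioned elsewhere in this log):
```python
import os

# Only force a specific torch build when the user hasn't chosen one already.
if not os.environ.get("TORCH_COMMAND"):
    os.environ["TORCH_COMMAND"] = "pip install torch==1.13.1 torchvision==0.14.1"
```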
w-e-w
96e446218c
link footer API to Wiki when API is not active
2023-06-06 18:58:44 +09:00
DGdev91
8646768801
Write "RX 5000 Series" instead of "Navi" in err
2023-06-06 10:03:20 +02:00
DGdev91
95d4d650d4
Check python version for Navi 1 only
2023-06-06 09:59:13 +02:00
DGdev91
e0d923bdf8
Force python1 for Navi1 only, use python_cmd for python
2023-06-06 09:55:49 +02:00
DGdev91
2788ce8c7b
Fix error in webui.sh
2023-06-06 01:51:35 +02:00
DGdev91
8d98532b65
Forcing Torch Version to 1.13.1 for Navi and Renoir GPUs
2023-06-06 01:05:31 +02:00
AUTOMATIC1111
a009fe15fd
Merge pull request #11047 from AUTOMATIC1111/parse_generation_parameters_with_error
...
handle exceptions when parsing generation parameters from png info
2023-06-06 00:13:27 +03:00