Commit Graph

6626 Commits

Author SHA1 Message Date
Lincoln Stein
6dc4ddef1b
Fix various bugs in ckpt to diffusers conversion script (#4065)
## What type of PR is this? (check all applicable)


- [X] Bug Fix


## Have you discussed this change with the InvokeAI team?
- [X] Yes
- [ ] No, because:

      
## Have you updated all relevant documentation?
- [ ] Yes
- [X] No


## Description

This PR fixes several issues with the 3.0.0 conversion script:

- Handles checkpoint variants that don't put dots between fields in the
long state-dict key names (see the sketch below)
- Handles EMA, non-EMA, pruned, and non-pruned ckpts
- Does not add a safety checker to converted checkpoints
- Respects the precision of the original checkpoint file
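
For orientation, here is a minimal sketch of the kind of EMA/non-EMA key handling described above. The key layout, helper name, and fallback logic are illustrative assumptions about LDM-style checkpoints, not the PR's actual code:

```python
# Illustrative sketch only -- not the code changed by this PR.
# Assumes an LDM-style checkpoint whose non-EMA UNet keys look like
# "model.diffusion_model.input_blocks.0.0.weight" and whose EMA keys drop
# the dots after the prefix, e.g. "model_ema.diffusion_modelinput_blocks00weight".
def extract_unet_weights(checkpoint: dict, use_ema: bool) -> dict:
    """Collect UNet weights, choosing EMA or non-EMA variants."""
    unet_key = "model.diffusion_model."
    unet_state_dict = {}
    for key in list(checkpoint.keys()):
        if not key.startswith(unet_key):
            continue
        if use_ema:
            # Rebuild the dot-free EMA key from the dotted non-EMA key.
            flat_ema_key = "model_ema." + "".join(key.split(".")[1:])
            if flat_ema_key not in checkpoint:
                continue  # a real converter would fall back or warn here
            key_to_copy = flat_ema_key
        else:
            key_to_copy = key
        # Copy the tensor unchanged so the checkpoint's original
        # precision (fp16 or fp32) is preserved.
        unet_state_dict[key[len(unet_key):]] = checkpoint[key_to_copy]
    return unet_state_dict
```

A pruned checkpoint may not contain EMA keys at all, which is why a fallback (or an explicit error) is needed in practice.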
2023-07-30 08:16:37 -04:00
Lincoln Stein
26af5ec341
Merge branch 'main' into bugfix/model-manager-rel-paths 2023-07-30 08:08:17 -04:00
Lincoln Stein
10b182f316
Merge branch 'main' into bugfix/convert-script 2023-07-30 08:07:51 -04:00
Lincoln Stein
ac84a9f915 reenable display of autoloaded models 2023-07-30 08:05:05 -04:00
Lincoln Stein
844578ab88 fix lora loading crash 2023-07-30 07:57:10 -04:00
Lincoln Stein
444390617f rebuild front end 2023-07-29 22:00:16 -04:00
Lincoln Stein
6cb40d9d7b bump version for hotfix 3.0.1post1 2023-07-29 21:58:57 -04:00
Lincoln Stein
ca895a9cd0
Unpin pydantic and numpy in pyproject.toml (#4062)
## What type of PR is this? (check all applicable)

- [X] Bug Fix


## Have you discussed this change with the InvokeAI team?
- [X] Yes
- [ ] No, because:

      
## Have you updated all relevant documentation?
- [ ] Yes
- [X] Not needed

## Description

Windows users have been getting a lot of OSErrors while installing 3.0.1
during the pip dependency-installation phase. The errors have generally
involved just two packages, pydantic and numpy. Looking at the install
logs, I see that both packages are first installed under one version
number by a dependency, then uninstalled and replaced by a slightly
different version specified in invoke's `pyproject.toml`. I suspect this
is the problem: perhaps the earlier package's files are not fully
released before it is uninstalled and reinstalled.

This PR relaxes the pinning of numpy and pydantic in `pyproject.toml`.
Everything seems to install and run properly, and hopefully it will
address the Windows install bug as well.
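
As a rough illustration of the difference between a hard pin and a relaxed constraint (the version numbers below are hypothetical, not the actual specifiers in InvokeAI's `pyproject.toml`):

```python
# Hypothetical pinned vs. relaxed specifiers; the versions are illustrative only.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

pinned = SpecifierSet("==1.24.2")   # hard pin: forces pip to swap an already-installed build
relaxed = SpecifierSet(">=1.23")    # relaxed: an existing compatible build already satisfies it

print(Version("1.25.1") in pinned)   # False -> pip must uninstall and reinstall
print(Version("1.25.1") in relaxed)  # True  -> no reinstall needed
```

With a relaxed specifier, pip can keep whichever compatible build a dependency already pulled in, avoiding the uninstall/reinstall cycle suspected of causing the OSErrors.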
2023-07-29 21:57:21 -04:00
Lincoln Stein
7d27c7b1a4
Merge branch 'main' into lstein/no-pydantic-in-pyproject 2023-07-29 21:47:16 -04:00
Lincoln Stein
6c82229910 fix recovery recipe 2023-07-29 20:00:06 -04:00
Lincoln Stein
43b1eb8e84 wording changes 2023-07-29 19:49:58 -04:00
Lincoln Stein
b10b07220e blackify code 2023-07-29 19:20:20 -04:00
Lincoln Stein
c2eb50d1cd make installer use initial INVOKEAI_ROOT as default install location 2023-07-29 19:19:42 -04:00
Lincoln Stein
73f3b7f84b remove dangling comment 2023-07-29 17:32:33 -04:00
Lincoln Stein
bb18251fad Merge branch 'bugfix/convert-script' of github.com:invoke-ai/InvokeAI into bugfix/convert-script 2023-07-29 17:31:02 -04:00
Lincoln Stein
348bee8981 blackified 2023-07-29 17:30:54 -04:00
Lincoln Stein
078b33bda2
Merge branch 'main' into bugfix/convert-script 2023-07-29 17:30:40 -04:00
Lincoln Stein
e82eb0b9fc add correct optional annotation to precision arg 2023-07-29 17:30:21 -04:00
Lincoln Stein
ad976e5198
Merge branch 'main' into bugfix/model-manager-rel-paths 2023-07-29 17:27:16 -04:00
Lincoln Stein
0e28961e69
Merge branch 'main' into lstein/no-pydantic-in-pyproject 2023-07-29 17:27:02 -04:00
Lincoln Stein
6ce059f063 blackified again 2023-07-29 17:26:40 -04:00
Lincoln Stein
1de783b1ce fix mistake in indexing flat_ema_key 2023-07-29 17:20:26 -04:00
Lincoln Stein
3f9105be50 make convert script respect setting of use_ema in config file 2023-07-29 17:17:45 -04:00
Lincoln Stein
781322a647 installer respects INVOKEAI_ROOT for default root location 2023-07-29 16:16:44 -04:00
Lincoln Stein
9a1cfadd8b
fix: SDXL Metadata not being retrieved (#4057)
## What type of PR is this? (check all applicable)

- [x] Bug Fix

## Have you discussed this change with the InvokeAI team?
- [x] Yes

## Description

- SDXL Metadata was not being retrieved. This PR fixes it.
2023-07-29 15:37:02 -04:00
Lincoln Stein
2a2d988928 convert script handles more ckpt variants 2023-07-29 15:28:39 -04:00
Lincoln Stein
72c519c6ad fix incorrect key construction 2023-07-29 13:51:47 -04:00
Lincoln Stein
af12f67948 Merge branch 'lstein/no-pydantic-in-pyproject' of github.com:invoke-ai/InvokeAI into lstein/no-pydantic-in-pyproject 2023-07-29 13:28:38 -04:00
Lincoln Stein
60f5606c2d downgrade torchmetrics to fix model import problem 2023-07-29 13:28:29 -04:00
Lincoln Stein
24b19166dd further refactoring 2023-07-29 13:13:22 -04:00
Lincoln Stein
0396bce4f9
Merge branch 'main' into lstein/no-pydantic-in-pyproject 2023-07-29 13:06:30 -04:00
Lincoln Stein
71768f5988 restore unpinned versions of pydantic and numpy 2023-07-29 13:04:34 -04:00
Lincoln Stein
0fb7328022 blackify code 2023-07-29 13:00:43 -04:00
Lincoln Stein
99daa97978 more refactoring; fixed place where rel conversion missed 2023-07-29 13:00:07 -04:00
Lincoln Stein
982a568349 blackify pr 2023-07-29 10:47:55 -04:00
Lincoln Stein
d79d5a4ff7 modest refactoring 2023-07-29 10:45:26 -04:00
Lincoln Stein
9968ff2893 fix relative model paths to be against config.models_path, not root 2023-07-29 10:30:27 -04:00
blessedcoolant
6d82a1019a fix: Black linting 2023-07-29 17:34:43 +12:00
blessedcoolant
6ed1bf7084
Merge branch 'main' into metadata-fix 2023-07-29 17:33:30 +12:00
blessedcoolant
974175be45
fix: Prompt Node using incorrect output type (#4058)
## What type of PR is this? (check all applicable)

- [ ] Refactor
- [ ] Feature
- [ ] Bug Fix
- [ ] Optimization
- [ ] Documentation Update
- [ ] Community Node Submission


## Have you discussed this change with the InvokeAI team?
- [ ] Yes
- [ ] No, because:

      
## Have you updated all relevant documentation?
- [ ] Yes
- [ ] No


## Description


## Related Tickets & Documents


- Related Issue #
- Closes #

## QA Instructions, Screenshots, Recordings


## Added/updated tests?

- [ ] Yes
- [ ] No : _please replace this line with details on why tests
      have not been included_

## [optional] Are there any post deployment tasks we need to perform?
2023-07-29 17:32:10 +12:00
blessedcoolant
bee678fdd1 fix: Prompt Node using incorrect output type 2023-07-29 17:12:25 +12:00
blessedcoolant
c5caf1e8fe fix: SDXL Metadata not being retrieved 2023-07-29 17:03:19 +12:00
blessedcoolant
72708eb53c
Feat/Nodes: Change Input to Textbox (#3853)
## What type of PR is this? (check all applicable)

- [ ] Refactor
- [X] Feature
- [ ] Bug Fix
- [ ] Optimization
- [ ] Documentation Update
- [ ] Community Node Submission


## Have you discussed this change with the InvokeAI team?
- [ ] Yes
- [X] No, because:
not yet; opening this PR to show the change
      
## Have you updated relevant documentation?
- [ ] Yes
- [ ] No


## Description
Temporarily changes the Node string input to a textbox, to allow easier
input of prompts and larger strings. It works for me, but please tell me
if I did it wrong and whether the size is OK.

## Related Tickets & Documents


- Related Issue #
- Closes #

## QA Instructions, Screenshots, Recordings


## Added/updated tests?

- [ ] Yes
- [ ] No : _please replace this line with details on why tests
      have not been included_

## [optional] Are there any post deployment tasks we need to perform?
2023-07-29 16:10:32 +12:00
blessedcoolant
aae1670080 fix: Incorrect Prompt Node output type 2023-07-29 16:04:19 +12:00
blessedcoolant
1e776d2523 chore: Regen types 2023-07-29 15:59:52 +12:00
blessedcoolant
8e06e6abbc feat: Update 'style' string input to also display text area 2023-07-29 15:52:59 +12:00
blessedcoolant
8a0e1b6cfc feat: Create Prompt Input Node 2023-07-29 15:52:37 +12:00
mickr777
2d9bc79ca4
Merge branch 'main' into nodepromptsize 2023-07-29 12:43:29 +10:00
mickr777
6886eb094d
Make more Simple 2023-07-29 12:40:17 +10:00
Brandon Rising
6ca0c38ee3 Merge branch 'main' into feat/onnx 2023-07-28 22:06:28 -04:00