Commit Graph

66 Commits

Author SHA1 Message Date
psychedelicious
c238a7f18b feat(api): chore: pydantic & fastapi upgrade
Upgrade pydantic and fastapi to latest.

- pydantic~=2.4.2
- fastapi~=0.103.2
- fastapi-events~=0.9.1

**Big Changes**

There are a number of logic changes needed to support pydantic v2. Most changes are very simple, like using the new methods to serialize and deserialize models, but there are a few more complex changes.

**Invocations**

The biggest change relates to invocation creation, instantiation and validation.

Because pydantic v2 moves all validation logic into the rust pydantic-core, we may no longer directly stick our fingers into the validation pie.

Previously, we (ab)used models and fields to allow invocation fields to be optional at instantiation, but required when `invoke()` is called. We directly manipulated the fields and invocation models when calling `invoke()`.

With pydantic v2, this is much more involved. Changes to the python wrapper do not propagate down to the rust validation logic - you have to rebuild the model. This causes problems with concurrent access to the invocation classes, and rebuilding is not a free operation.

This logic has been totally refactored and we do not need to change the model any more. The details are in `baseinvocation.py`, in the `InputField` function and `BaseInvocation.invoke_internal()` method.

In the end, this implementation is cleaner.
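
To illustrate the general idea, here is a minimal, hypothetical sketch (not the actual `baseinvocation.py` code): fields get a default so the model validates at instantiation, a marker in `json_schema_extra` records that they are logically required, and that marker is checked just before `invoke()` runs - no model rebuild needed.

```py
from typing import Any, Optional

from pydantic import BaseModel, Field


def RequiredAtInvokeField(description: str) -> Any:
    # Optional at instantiation, but flagged as required when invoke() runs.
    # "required_at_invoke" is a made-up marker key for this sketch.
    return Field(default=None, description=description, json_schema_extra={"required_at_invoke": True})


class ExampleInvocation(BaseModel):
    image_name: Optional[str] = RequiredAtInvokeField("Image to process")

    def invoke_internal(self) -> str:
        # Check the flagged fields without touching the compiled validation core.
        for name, field in type(self).model_fields.items():
            extra = field.json_schema_extra or {}
            if extra.get("required_at_invoke") and getattr(self, name) is None:
                raise ValueError(f"Missing required field: {name}")
        return self.invoke()

    def invoke(self) -> str:
        return f"processing {self.image_name}"
```

With this sketch, `ExampleInvocation()` instantiates fine, but calling `invoke_internal()` on it raises; `ExampleInvocation(image_name="foo").invoke_internal()` succeeds.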

**Invocation Fields**

In pydantic v2, you can no longer directly add or remove fields from a model.

Previously, we did this to add the `type` field to invocations.

**Invocation Decorators**

With pydantic v2, we instead use the imperative `create_model()` API to create a new model with the additional field. This is done in `baseinvocation.py` in the `invocation()` wrapper.

A similar technique is used for `invocation_output()`.
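
As a rough, hypothetical illustration of the technique (not the actual `invocation()` code), `create_model()` can rebuild a class with an extra literal `type` field:

```py
from typing import Literal

from pydantic import BaseModel, create_model


class ResizeInvocation(BaseModel):
    width: int = 512
    height: int = 512


# Fields can no longer be added to an existing class, so build a new model
# that inherits from it and carries the extra `type` field.
ResizeInvocationWithType = create_model(
    "ResizeInvocation",
    __base__=ResizeInvocation,
    type=(Literal["resize"], "resize"),  # (annotation, default)
)

print(ResizeInvocationWithType().type)  # "resize"
```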

**Minor Changes**

There are a number of minor changes around the pydantic v2 models API.

**Protected `model_` Namespace**

All models' pydantic-provided methods and attributes are prefixed with `model_` and this is considered a protected namespace. This causes some conflict, because "model" means something to us, and we have a ton of pydantic models with attributes starting with "model_".

Fortunately, there are no direct conflicts. However, in any pydantic model where we define an attribute or method that starts with "model_", we must set the protected namespaces to an empty tuple.

```py
from pydantic import BaseModel, ConfigDict, Field


class IPAdapterModelField(BaseModel):
    # BaseModelType is an enum defined elsewhere in the project.
    model_name: str = Field(description="Name of the IP-Adapter model")
    base_model: BaseModelType = Field(description="Base model")

    model_config = ConfigDict(protected_namespaces=())
```

**Model Serialization**

Pydantic models no longer have `Model.dict()` or `Model.json()`.

Instead, we use `Model.model_dump()` or `Model.model_dump_json()`.
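
For example, with a toy model:

```py
from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int


p = Point(x=1, y=2)
print(p.model_dump())       # {'x': 1, 'y': 2} - replaces p.dict()
print(p.model_dump_json())  # '{"x":1,"y":2}' - replaces p.json()
```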

**Model Deserialization**

Pydantic models no longer have `Model.parse_obj()` or `Model.parse_raw()`, and there are no `parse_raw_as()` or `parse_obj_as()` functions.

Instead, you need to create a `TypeAdapter` object to parse python objects or JSON into a model.

```py
from pydantic import TypeAdapter

adapter_graph = TypeAdapter(Graph)
deserialized_graph_from_json = adapter_graph.validate_json(graph_json)
deserialized_graph_from_dict = adapter_graph.validate_python(graph_dict)
```

**Field Customisation**

Pydantic `Field`s no longer accept arbitrary args.

Now, you must put all additional arbitrary args in a `json_schema_extra` arg on the field.
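
For example (the `ui_order` key here is a hypothetical extra, not a pydantic or project API):

```py
from pydantic import BaseModel, Field


class ExampleModel(BaseModel):
    # Arbitrary extras are no longer accepted as bare kwargs on Field();
    # they must be nested under json_schema_extra.
    value: int = Field(default=0, description="A value", json_schema_extra={"ui_order": 1})
```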

**Schema Customisation**

FastAPI and pydantic schema generation now follows the OpenAPI version 3.1 spec.

This necessitates two changes:
- Our schema customization logic has been revised
- Schema parsing to build node templates has been revised

The specifics aren't important, but this does present additional surface area for bugs.

**Performance Improvements**

Pydantic v2 is a full rewrite with a rust backend. This offers a substantial performance improvement (pydantic claims 5x to 50x depending on the task). We'll notice this the most during serialization and deserialization of sessions/graphs, which happens very, very often - a couple of times per node.

I haven't done any benchmarks, but anecdotally, graph execution is much faster. Also, very large graphs - like those with massive iterators - are much, much faster.
2023-10-17 14:59:25 +11:00
Ryan Dick
78377469db
Add support for T2I-Adapter in node workflows (#4612)
* Bump diffusers to 0.21.2.

* Add T2IAdapterInvocation boilerplate.

* Add T2I-Adapter model to model-management.

* (minor) Tidy prepare_control_image(...).

* Add logic to run the T2I-Adapter models at the start of the DenoiseLatentsInvocation.

* Add logic for applying T2I-Adapter weights and accumulating.

* Add T2IAdapter to MODEL_CLASSES map.

* yarn typegen

* Add model probes for T2I-Adapter models.

* Add all of the frontend boilerplate required to use T2I-Adapter in the nodes editor.

* Add T2IAdapterModel.convert_if_required(...).

* Fix errors in T2I-Adapter input image sizing logic.

* Fix bug with handling of multiple T2I-Adapters.

* black / flake8

* Fix typo

* yarn build

* Add num_channels param to prepare_control_image(...).

* Link to upstream diffusers bugfix PR that currently requires a workaround.

* feat: Add Color Map Preprocessor

Needed for the color T2I Adapter

* feat: Add Color Map Preprocessor to Linear UI

* Revert "feat: Add Color Map Preprocessor"

This reverts commit a1119a00bf.

* Revert "feat: Add Color Map Preprocessor to Linear UI"

This reverts commit bd8a9b82d8.

* Fix T2I-Adapter field rendering in workflow editor.

* yarn build, yarn typegen

---------

Co-authored-by: blessedcoolant <54517381+blessedcoolant@users.noreply.github.com>
Co-authored-by: psychedelicious <4822129+psychedelicious@users.noreply.github.com>
2023-10-05 16:29:16 +11:00
Ryan Dick
b57acb7353 Merge branch 'main' into feat/ip-adapter 2023-09-15 13:15:25 -04:00
Ryan Dick
2c1100509f Add BaseModelType.Any to be used by CLIPVisionModel. 2023-09-14 08:19:55 -04:00
Ryan Dick
3d52656176 Add CLIPVisionModel to model management. 2023-09-13 17:14:20 -04:00
Ryan Dick
3ee9a21647 Initial (barely) working version of IP-Adapter model management. 2023-09-13 08:27:24 -04:00
Ryan Dick
163ece9aee Initial skeleton for IPAdapter model management. 2023-09-13 08:27:24 -04:00
Martin Kristiansen
caea6d11c6 isort wip 2 2023-09-12 13:01:58 -04:00
Martin Kristiansen
537ae2f901 Resolving merge conflicts for flake8 2023-08-18 15:52:04 +10:00
Lincoln Stein
e080fd1e08 blackify 2023-08-03 11:25:20 +10:00
Lincoln Stein
eeef1e08f8 restore ability to convert merged inpaint .safetensors files 2023-08-03 11:25:20 +10:00
Brandon Rising
da751da3dd Merge branch 'main' into feat/onnx 2023-07-28 09:59:35 -04:00
Brandon Rising
2b7b3dd4ba Run python black 2023-07-28 09:46:44 -04:00
Brandon Rising
1ea9ba84f5 Release session if applying ti or lora 2023-07-27 15:20:38 -04:00
Brandon Rising
bfdc8c80f3 Testing caching onnx sessions 2023-07-27 14:13:29 -04:00
Brandon Rising
59716938bf Remove TensorRT support at the current time until we validate it works, remove time step recorder 2023-07-27 11:18:50 -04:00
Martin Kristiansen
218b6d0546 Apply black 2023-07-27 10:54:01 -04:00
Brandon Rising
f7bb4c3f05 Remove more files no longer needed in main 2023-07-27 10:49:43 -04:00
Brandon Rising
989d3d7f3c Remove onnx changes from canvas img2img, inpaint, and linear image2image 2023-07-27 10:08:45 -04:00
Brandon Rising
861c0fe76b Correct issues caused by merging main 2023-07-26 12:25:46 -04:00
Brandon Rising
c16da75ac7 Merge branch 'main' into feat/onnx 2023-07-26 10:42:31 -04:00
Lincoln Stein
845d1524ad warn, do not crash, when duplicate models encountered 2023-07-21 15:00:55 -04:00
Brandon Rising
78750042f5 Pass in dim overrides 2023-07-21 12:16:24 -04:00
Brandon Rising
4e90376d11 Allow passing in of precision, use available providers if none provided 2023-07-20 13:15:45 -04:00
Brandon Rising
43b6a077fb io binding seems to be massively resource intensive compared to session.run 2023-07-19 17:42:28 -04:00
Brandon Rising
e8299d0abb Comment out erroneously removed del statement, comment out opt tests 2023-07-18 23:23:34 -04:00
Brandon Rising
ee7b36cea5 Merge branch 'main' into onnx-testing 2023-07-18 22:56:41 -04:00
Brandon Rising
e201ad2f51 Switch to io_binding for run, testing different session options 2023-07-18 21:54:54 -04:00
Brandon Rising
35d5ef9118 Emit step completions 2023-07-18 12:35:07 -04:00
Brandon Rising
bcce70fca6 Testing different session opts, added timings for testing 2023-07-17 16:27:33 -04:00
Brandon Rising
932112b640 testing being super wasteful with data 2023-07-16 00:17:33 -04:00
Lincoln Stein
ccbfa5d862 resolve conflicts 2023-07-15 19:47:50 -04:00
Lincoln Stein
2faa7cee37 add rename_model route 2023-07-14 23:03:18 -04:00
Brandon Rising
524888bf3b Merge branch 'main' into feat/onnx 2023-07-13 14:23:57 -04:00
Sergey Borisov
a328986b43 Less naive model detection 2023-07-12 08:50:19 -04:00
Lincoln Stein
bf2b5b5cd4 improvements to sdxl support in model manager
- Move SDXL-related models to models/sdxl.py
- Create separate base type BaseModelType.StableDiffusionXLRefiner for the refiner
  models.
2023-07-09 20:42:03 -04:00
Lincoln Stein
130249a2dd add model loading support for SDXL 2023-07-09 15:47:06 -04:00
Sergey Borisov
0ac9dca926 Fix loading diffusers ti 2023-07-05 19:46:00 +03:00
Lincoln Stein
e8ed0fad6c autoimport from embedding/controlnet/lora folders designated in startup file 2023-06-27 12:30:53 -04:00
Lincoln Stein
60b37b7ff4 fix model manager documentation 2023-06-25 16:04:43 -04:00
Lincoln Stein
c3c4a71173 implemented Stalker's suggested improvements 2023-06-24 12:37:26 -04:00
Lincoln Stein
ba1371a88f rename ModelType.Pipeline to ModelType.Main 2023-06-24 11:45:49 -04:00
Lincoln Stein
466ec3ab5e add router API support for model manager `heuristic_import()` 2023-06-23 16:35:39 -04:00
Sergey Borisov
6c7668aaca Update onnx model structure, change code according 2023-06-22 20:03:17 +03:00
Sergey Borisov
da566b59e8 Update model format field to use enums 2023-06-22 16:51:53 +10:00
Sergey Borisov
e4dc9c5a04 Rename format to model_format (still named format when working with config) 2023-06-22 16:51:53 +10:00
Sergey Borisov
aceadacad4 Remove default model logic 2023-06-22 16:51:53 +10:00
blessedcoolant
727293d722 fix: 2.1 models breaking generation
Co-Authored-By: StAlKeR7779 <7768370+StAlKeR7779@users.noreply.github.com>
2023-06-22 16:42:59 +10:00
Sergey Borisov
ef83a2fffe Add name, base_mode, type fields to model info 2023-06-22 16:42:51 +10:00
Sergey Borisov
7759b3f75a Small refactor 2023-06-21 04:24:25 +03:00