## What type of PR is this? (check all applicable)
- [ ] Refactor
- [ ] Feature
- [x] Bug Fix
- [ ] Optimization
- [ ] Documentation Update
- [ ] Community Node Submission
## Description
[fix(stats): fix fail case when previous graph is
invalid](d1d2d5a47d)
When a graph is retrieved, it is parsed by pydantic. It is possible
that the stored graph is invalid, in which case an error is thrown.
Handle this by deleting the failed graph from the stats when this occurs.
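A minimal sketch of the handling, assuming a pydantic-validated graph store; the names here (`manager.get()`, the `stats` dict) are illustrative, not the actual service API:

```python
from typing import Any

from pydantic import ValidationError


def get_state_or_prune(stats: dict[str, Any], manager: Any, graph_id: str) -> Any:
    """Fetch the graph execution state, pruning the stats entry if parsing fails."""
    try:
        # manager.get() re-parses the persisted graph through pydantic and can
        # raise ValidationError if the stored graph no longer validates
        return manager.get(graph_id)
    except ValidationError:
        # Delete the failed graph from the stats so later calls don't keep failing
        stats.pop(graph_id, None)
        raise
```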
[fix(stats): fix InvocationStatsService
types](1b70bd1380)
- move docstrings to the ABC
- `start_time: int` -> `start_time: float`
- remove class attribute assignments in `StatsContext`
- add `update_mem_stats()` to the ABC
- add class attributes to the ABC, because they are referenced on instances
of the class; if they should not be on the ABC, some restructuring may be
needed (see the sketch after this list)
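A rough sketch of the resulting shape; apart from `update_mem_stats()` and `start_time`, the names here are assumptions rather than the service's real attributes:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class NodeStats:
    """Illustrative per-node record; field names are assumptions."""

    start_time: float = 0.0  # float, not int: time.time() returns fractional seconds
    max_vram_gb: float = 0.0


class InvocationStatsServiceBase(ABC):
    # Declared on the ABC because code typed against the ABC reads these
    # attributes on instances; concrete services assign them in __init__
    _stats: dict[str, NodeStats]

    @abstractmethod
    def update_mem_stats(self, stats: NodeStats) -> None:
        """Docstrings live on the ABC so every implementation inherits them."""
        ...
```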
## QA Instructions, Screenshots, Recordings
On `main` (not this PR), create a situation in which a graph is valid
but will be rendered invalid on invoke. An easy way in the node editor:
- create an `Integer Primitive` node, set value to 3
- create a `Resize Image` node and add an image to it
- route the output of `Integer Primitive` to the `width` of `Resize
Image`
- Invoke - this will first cause a `Validation Error` (expected); if
you inspect the error in the JS console, you'll see it is a "session
retrieval error"
- Invoke again - this will also cause a `Validation Error`, but if you
inspect the error you should see it originates in the stats module (this
is the error this PR fixes)
- Fix the graph by setting the `Integer Primitive` to 512
- Invoke again - you get the same `Validation Error` originating from
stats, even though there are no issues
Switch to this PR, and then you should only ever get the `Validation
Error` that is classified as a "session retrieval error".
This improves the overall responsiveness of the system substantially, but does make each iteration *slightly* slower, distributing the up-front cost across the batch.
Two main changes:
1. Create BatchSessions immediately, but do not create a whole graph execution state until the batch is executed.
BatchSessions are created with a `session_id` that does not exist in the sessions database.
The default state is changed to `"uninitialized"` to better represent this.
Results: time to create 5000 batches reduced from over 30s to 2.5s
2. Use `executemany()` to retrieve lists of created sessions (a sketch follows below).
Results: time to create 5000 batches reduced from 2.5s to under 0.5s
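A rough sketch of both changes combined, assuming a SQLite-backed store; the table and column names are invented, with `executemany()` shown doing the bulk insert of placeholder rows:

```python
import sqlite3
import uuid


def create_batch_sessions(conn: sqlite3.Connection, batch_id: str, count: int) -> list[str]:
    """Reserve session ids up front; graph execution state is created lazily on invoke."""
    # These ids do not yet exist in the sessions table -- each row starts out
    # "uninitialized" and is only materialized when the batch is executed
    session_ids = [uuid.uuid4().hex for _ in range(count)]
    conn.executemany(
        "INSERT INTO batch_session (batch_id, session_id, state) VALUES (?, ?, ?);",
        [(batch_id, sid, "uninitialized") for sid in session_ids],
    )
    conn.commit()
    return session_ids
```

The design choice is to keep batch rows as lightweight placeholders and defer the expensive graph execution state creation until the batch actually runs.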
Other changes:
- set BatchSession state to `"in_progress"` just before `invoke()` is called
- rename a few methods to accommodate the new behaviour
- remove unused `BatchProcessStorage.get_created_sessions()` method
It is `"invocation"` for invocations and `"output"` for outputs. Clients may use this to confidently and positively identify if an OpenAPI schema object is an invocation or output, instead of using a potentially fragile heuristic.
Doing this via `BaseInvocation`'s `Config.schema_extra()` means all clients get an accurate OpenAPI schema.
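A minimal sketch of the mechanism via pydantic v1's `Config.schema_extra` hook; the property name `class` is an assumption, and an analogous hook on `BaseInvocationOutput` would set `"output"`:

```python
from typing import Any, Type

from pydantic import BaseModel


class BaseInvocation(BaseModel):
    class Config:
        @staticmethod
        def schema_extra(schema: dict[str, Any], model_class: Type["BaseInvocation"]) -> None:
            # Every subclass inherits this Config, so every invocation's OpenAPI
            # schema carries the marker; "class" is a stand-in property name
            schema["class"] = "invocation"
```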
Shifts the responsibility of correct types to the backend, where previously it was on the client.
Doing this via these classes' `Config.schema_extra()` method makes it unobtrusive, and clients will get the correct types for these properties.
The `type` property is required on all of them, but because it is defined in pydantic as a `Literal` with a default value, it is not marked required in the OpenAPI schema. It's easier to fix this by changing the generated types than by fiddling with pydantic.
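For illustration, a minimal pydantic v1 model showing the behaviour; the class and field names are made up, echoing the Resize Image example above:

```python
from typing import Literal

from pydantic import BaseModel


class ResizeImageInvocation(BaseModel):
    # The Literal field carries a default, so pydantic leaves it out of the
    # schema's "required" list even though it can only hold this one value
    type: Literal["img_resize"] = "img_resize"
    width: int


print(ResizeImageInvocation.schema()["required"])  # ['width'] -- no 'type'
```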