mirror of
https://github.com/invoke-ai/InvokeAI
synced 2024-08-30 20:32:17 +00:00
88ae19a768
This improves the overall responsiveness of the system substantially, but does make each iteration *slightly* slower, distributing the up-front cost across the batch.

Two main changes:

1. Create BatchSessions immediately, but do not create a whole graph execution state until the batch is executed. BatchSessions are created with a `session_id` that does not exist in the sessions database. The default state is changed to `"uninitialized"` to better represent this.

   Results: time to create 5000 batches reduced from over 30s to 2.5s.

2. Use `executemany()` to retrieve lists of created sessions.

   Results: time to create 5000 batches reduced from 2.5s to under 0.5s.

Other changes:
- set BatchSession state to `"in_progress"` just before `invoke()` is called
- rename a few methods to accommodate the new behaviour
- remove unused `BatchProcessStorage.get_created_sessions()` method
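The bulk-write pattern behind change 2 can be sketched with Python's standard `sqlite3` module. This is a minimal illustration only: the table name, columns, and the `"uninitialized"` default are modeled on the commit message, not on InvokeAI's actual schema, and the helper names here are hypothetical.

```python
import sqlite3
import uuid

# Hypothetical schema standing in for InvokeAI's batch-session storage.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE batch_session (
           batch_id   TEXT NOT NULL,
           session_id TEXT NOT NULL,
           state      TEXT NOT NULL
       )"""
)

batch_id = uuid.uuid4().hex

# session_ids are generated up front; the matching rows in the sessions
# table would only be created later, when the batch is actually executed.
rows = [(batch_id, uuid.uuid4().hex, "uninitialized") for _ in range(5000)]

# One executemany() call instead of 5000 individual INSERTs: the driver
# reuses the prepared statement, which is where the speedup comes from.
with conn:
    conn.executemany(
        "INSERT INTO batch_session (batch_id, session_id, state) VALUES (?, ?, ?)",
        rows,
    )

count = conn.execute(
    "SELECT COUNT(*) FROM batch_session WHERE state = 'uninitialized'"
).fetchone()[0]
print(count)  # 5000
```

Wrapping the `executemany()` in a single transaction (the `with conn:` block) also avoids one fsync per row, which matters at least as much as statement reuse for this kind of bulk insert.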
Repository root contents:

app
assets
backend
configs
frontend
version
__init__.py
README
Organization of the source tree:

app      -- Home of nodes, invocations, and services
assets   -- Images and other data files used by InvokeAI
backend  -- Non-user-facing libraries, including the rendering core
configs  -- Configuration files used at install and run times
frontend -- User-facing scripts, including the CLI and the WebUI
version  -- Current InvokeAI version string, stored in version/invokeai_version.py