mirror of
https://github.com/invoke-ai/InvokeAI
synced 2024-08-30 20:32:17 +00:00
88ae19a768
This improves the overall responsiveness of the system substantially, but does make each iteration *slightly* slower, distributing the up-front cost across the batch.

Two main changes:

1. Create BatchSessions immediately, but do not create a whole graph execution state until the batch is executed. BatchSessions are created with a `session_id` that does not yet exist in the sessions database. The default state is changed to `"uninitialized"` to better represent this.

   Result: time to create 5000 batches reduced from over 30s to 2.5s.

2. Use `executemany()` to retrieve lists of created sessions.

   Result: time to create 5000 batches reduced from 2.5s to under 0.5s.

Other changes:
- set BatchSession state to `"in_progress"` just before `invoke()` is called
- rename a few methods to accommodate the new behaviour
- remove the unused `BatchProcessStorage.get_created_sessions()` method
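The `executemany()` optimization described above can be sketched as follows. This is a minimal illustration, not InvokeAI's actual code: the table name, column names, and schema below are hypothetical, chosen only to show how one `executemany()` call replaces thousands of individual INSERT round-trips when pre-creating sessions in an `"uninitialized"` state.

```python
# Hypothetical sketch of batching BatchSession inserts with executemany().
# Table and column names are illustrative, not InvokeAI's real schema.
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE batch_sessions (
        session_id TEXT PRIMARY KEY,
        batch_id   TEXT NOT NULL,
        state      TEXT NOT NULL DEFAULT 'uninitialized'
    )
    """
)

batch_id = uuid.uuid4().hex

# Pre-generate session_ids up front; the matching graph execution state
# is only created later, when the batch is actually executed.
rows = [(uuid.uuid4().hex, batch_id, "uninitialized") for _ in range(5000)]

# One executemany() call instead of 5000 separate INSERT statements.
conn.executemany(
    "INSERT INTO batch_sessions (session_id, batch_id, state) VALUES (?, ?, ?)",
    rows,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM batch_sessions").fetchone()[0]
print(count)  # 5000
```

The win comes from amortizing statement preparation and per-call overhead across the whole batch, which matches the reported drop from 2.5s to under 0.5s for 5000 sessions.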
models
__init__.py
batch_manager_storage.py
batch_manager.py
board_image_record_storage.py
board_images.py
board_record_storage.py
boards.py
config.py
default_graphs.py
events.py
graph.py
image_file_storage.py
image_record_storage.py
images.py
invocation_queue.py
invocation_services.py
invocation_stats.py
invoker.py
item_storage.py
latent_storage.py
model_manager_service.py
processor.py
resource_name.py
sqlite.py
urls.py