# InvokeAI Web UI

- [InvokeAI Web UI](#invokeai-web-ui)
  - [Core Libraries](#core-libraries)
    - [Redux Toolkit](#redux-toolkit)
    - [Socket\.IO](#socketio)
    - [Chakra UI](#chakra-ui)
    - [KonvaJS](#konvajs)
    - [Vite](#vite)
    - [i18next & Weblate](#i18next--weblate)
    - [openapi-typescript](#openapi-typescript)
    - [reactflow](#reactflow)
    - [zod](#zod)
  - [Client Types Generation](#client-types-generation)
  - [Package Scripts](#package-scripts)
  - [Contributing](#contributing)
    - [Dev Environment](#dev-environment)
      - [VSCode Remote Dev](#vscode-remote-dev)
    - [Production builds](#production-builds)

The UI is a fairly straightforward TypeScript React app.

## Core Libraries

InvokeAI's UI is made possible by a number of excellent open-source libraries. The most heavily-used are listed below, but there are many others.

### Redux Toolkit

[Redux Toolkit] is used for state management and fetching/caching:

- `RTK Query` for data fetching and caching
- `createAsyncThunk` for a couple other HTTP requests
- `createEntityAdapter` to normalize things like images and models
- `createListenerMiddleware` for async workflows

We use [redux-remember] for persistence.

### Socket\.IO

[Socket.IO] is used for server-to-client events, like generation progress and queue state changes.

### Chakra UI

[Chakra UI] is our primary UI library, but we also use a few components from [Mantine v6].

### KonvaJS

[KonvaJS] powers the canvas. In the future, we'd like to explore [PixiJS] or WebGPU.

### Vite

[Vite] is our bundler.

### i18next & Weblate

We use [i18next] for localization, but translation to languages other than English happens on our [Weblate] project.

**Only the English source strings should be changed in this repo.**

### openapi-typescript

[openapi-typescript] is used to generate types from the server's OpenAPI schema. See TYPES_CODEGEN.md.

### reactflow

[reactflow] powers the Workflow Editor.

### zod

[zod] schemas are used to model data structures and provide runtime validation (a small sketch is included at the end of this page).

## Client Types Generation

We use [openapi-typescript] to generate types from the app's OpenAPI schema. The generated types are written to `invokeai/frontend/web/src/services/api/schema.d.ts`. This file is committed to the repo.

The server must be started and available.

```sh
# from the repo root, start the server
python scripts/invokeai-web.py

# from invokeai/frontend/web/, run the script
yarn typegen
```

An illustration of how the generated types can be consumed is sketched at the end of this page.

## Package Scripts

See `package.json` for all scripts. Run with `yarn <script name>`.
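As an illustration of the [Client Types Generation](#client-types-generation) workflow above, here is a self-contained sketch of the shape openapi-typescript produces and how code can index into it. The route and schema names below are invented for the example; in the real app, the `paths` and `components` interfaces come from the generated `schema.d.ts` rather than being written by hand.

```ts
// Sketch only: a miniature version of what openapi-typescript generates.
// The real interfaces live in `src/services/api/schema.d.ts`; the route and
// schema names here are hypothetical.
interface components {
  schemas: {
    ExampleDTO: { id: string; name: string };
  };
}

interface paths {
  '/api/v1/example/': {
    get: {
      responses: {
        200: { content: { 'application/json': components['schemas']['ExampleDTO'] } };
      };
    };
  };
}

// Types are derived by indexing into the generated interfaces, so they stay
// in sync with the server's OpenAPI schema.
type ExampleDTO = components['schemas']['ExampleDTO'];
type GetExampleResponse =
  paths['/api/v1/example/']['get']['responses'][200]['content']['application/json'];

// Using the derived types to annotate a small fetch helper.
export async function getExample(baseUrl: string): Promise<ExampleDTO> {
  const res = await fetch(`${baseUrl}/api/v1/example/`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as GetExampleResponse;
}
```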
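Similarly, a minimal sketch of the kind of runtime validation described in the [zod](#zod) section. The schema below is an invented example, not one of the app's actual schemas.

```ts
import { z } from 'zod';

// An invented example schema; the app's real schemas model its own data structures.
const zGenerationSettings = z.object({
  steps: z.number().int().positive(),
  cfgScale: z.number().min(1),
  scheduler: z.enum(['euler', 'ddim']),
});

// The static type is inferred from the schema, so the TypeScript type and the
// runtime validator cannot drift apart.
type GenerationSettings = z.infer<typeof zGenerationSettings>;

// `parse` throws on invalid data; `safeParse` returns a result object instead.
const settings: GenerationSettings = zGenerationSettings.parse({
  steps: 30,
  cfgScale: 7.5,
  scheduler: 'euler',
});

console.log(settings.scheduler);
```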