Cache updates (#7388)

* Add redis container to development setup

* Improve configurability of global cache:

- Push configuration into separate file

* Settings cache default depends on global cache configuration

* Update docker compose file

* Remove debug flag

* Allow caching for registry checks

* Cleanup docstring

* Adjust default behaviour

* Update docs for caching

* Adjust default docker compose file

* Update docs for devcontainer

* Cleanup config template file

* Update docker docs

* Update cache behaviour
Oliver 2024-06-02 21:43:31 +10:00 committed by GitHub
parent df619ec17d
commit cdac7465b2
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
14 changed files with 190 additions and 100 deletions

View File

@@ -13,6 +13,12 @@ services:
POSTGRES_USER: inventree_user
POSTGRES_PASSWORD: inventree_password
redis:
image: redis:7.0
restart: always
expose:
- 6379
inventree:
build:
context: ..
@@ -31,6 +37,8 @@ services:
INVENTREE_DB_HOST: db
INVENTREE_DB_USER: inventree_user
INVENTREE_DB_PASSWORD: inventree_password
INVENTREE_CACHE_HOST: redis
INVENTREE_CACHE_PORT: 6379
INVENTREE_PLUGINS_ENABLED: True
INVENTREE_SITE_URL: http://localhost:8000
INVENTREE_CORS_ORIGIN_ALLOW_ALL: True

View File

@@ -28,6 +28,7 @@ INVENTREE_DB_PASSWORD=pgpassword
# Un-comment the following lines to enable Redis cache
# Note that you will also have to run docker-compose with the --profile redis command
# Refer to settings.py for other cache options
#INVENTREE_CACHE_ENABLED=True
#INVENTREE_CACHE_HOST=inventree-cache
#INVENTREE_CACHE_PORT=6379

View File

@@ -53,14 +53,9 @@ services:
restart: unless-stopped
# redis acts as database cache manager
# only runs under the "redis" profile : https://docs.docker.com/compose/profiles/
inventree-cache:
image: redis:7.0
container_name: inventree-cache
depends_on:
- inventree-db
profiles:
- redis
env_file:
- .env
expose:

View File

@@ -18,12 +18,13 @@ You need to make sure that you have the following tools installed before continu
#### Docker Containers
The InvenTree devcontainer setup will install two docker containers:
The InvenTree devcontainer setup will install the following docker containers:
| Container | Description |
| --- | --- |
| inventree | InvenTree host server |
| db | InvenTree database (postgresql) |
| inventree | InvenTree server |
| redis | Redis server for caching |
#### Setup/Installation
@@ -119,3 +120,9 @@ If you are running a devcontainer in Windows, you may experience some performanc
For a significant improvement in performance, the source code should be installed into the **WSL 2** filesystem (not on your "Windows" filesystem). This will greatly improve file access performance, and also make the devcontainer much more responsive to file system changes.
You can also refer to the [Improve disk performance guide](https://code.visualstudio.com/remote/advancedcontainers/improve-performance) for more information.
### Redis Caching
The devcontainer setup provides a [redis](https://redis.io/) container which can be used as a global cache backend. Caching is disabled by default, but it can easily be enabled for testing or development with the [redis cache](../start/config.md#caching).
To enable the cache, locate the InvenTree configuration file (`./dev/config.yaml`) and set the `cache.enabled` setting to `True`.
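For example, the relevant section of the configuration file might look like the following (the `host` and `port` values below match the devcontainer's redis service, but verify them against your own setup):

```yaml
# ./dev/config.yaml - enable the redis cache for the devcontainer
cache:
  enabled: True
  host: 'redis'
  port: 6379
```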

View File

@@ -202,6 +202,35 @@ If running with a MySQL database backend, the following additional options are a
| --- | --- | --- | --- |
| INVENTREE_DB_ISOLATION_SERIALIZABLE | database.serializable | Database isolation level configured to "serializable" | False |
## Caching
InvenTree can be configured to use [redis](https://redis.io) as a global cache backend.
Enabling a global cache can provide significant performance improvements for InvenTree.
### Cache Server
Enabling global caching requires a connection to a redis server, which runs separately from the InvenTree database and web server. Setup and configuration of this server is outside the scope of this documentation; it is assumed that you have already set up a redis server and are comfortable configuring it.
!!! tip "Docker Support"
If you are running [InvenTree under docker](./docker.md), we provide a redis container as part of our docker compose file - so redis caching works out of the box.
### Cache Settings
The following cache settings are available:
| Environment Variable | Configuration File | Description | Default |
| --- | --- | --- | --- |
| INVENTREE_CACHE_ENABLED | cache.enabled | Enable redis caching | False |
| INVENTREE_CACHE_HOST | cache.host | Cache server host | *Not specified* |
| INVENTREE_CACHE_PORT | cache.port | Cache server port | 6379 |
| INVENTREE_CACHE_CONNECT_TIMEOUT | cache.connect_timeout | Cache connection timeout (seconds) | 3 |
| INVENTREE_CACHE_TIMEOUT | cache.timeout | Cache timeout (seconds) | 3 |
| INVENTREE_CACHE_TCP_KEEPALIVE | cache.tcp_keepalive | Cache TCP keepalive | True |
| INVENTREE_CACHE_KEEPALIVE_COUNT | cache.keepalive_count | Cache keepalive count | 5 |
| INVENTREE_CACHE_KEEPALIVE_IDLE | cache.keepalive_idle | Cache keepalive idle | 1 |
| INVENTREE_CACHE_KEEPALIVE_INTERVAL | cache.keepalive_interval | Cache keepalive interval | 1 |
| INVENTREE_CACHE_USER_TIMEOUT | cache.user_timeout | Cache user timeout | 1000 |
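Before enabling the cache, it can be useful to confirm that the configured host and port are actually reachable. The helper below is a hypothetical sketch (not part of InvenTree) using only the Python standard library; the default timeout mirrors the `INVENTREE_CACHE_CONNECT_TIMEOUT` default from the table above:

```python
import socket

def check_cache_reachable(host: str, port: int = 6379, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to the cache server succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, unreachable host, or timeout
        return False
```

A `False` result usually indicates a wrong host/port, a firewall, or a redis server that is not running.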
## Email Settings
To enable [email functionality](../settings/email.md), email settings must be configured here, either via environment variables or within the configuration file.

View File

@@ -136,10 +136,8 @@ This container uses the official [redis image](https://hub.docker.com/_/redis).
Docker adds an additional network layer, which might lead to lower performance than bare metal.
To optimize and configure your redis deployment, follow the [official docker guide](https://redis.io/docs/getting-started/install-stack/docker/#configuration).
!!! warning "Disabled by default"
The *redis* container is not enabled in the default configuration. This is provided as an example for users wishing to use redis.
To enable the *redis* container, run any `docker compose` commands with the `--profile redis` flag.
You will also need to un-comment the `INVENTREE_CACHE_<...>` variables in the `.env` file.
!!! tip "Enable Cache"
While a redis container is provided in the default configuration, it is not enabled in the InvenTree server by default. You can enable redis cache support by following the [caching configuration guide](./config.md#caching).
### Data Volume

View File

@@ -0,0 +1,105 @@
"""Configuration options for InvenTree external cache."""
import logging
import socket
import InvenTree.config
import InvenTree.ready
logger = logging.getLogger('inventree')
def cache_setting(name, default=None, **kwargs):
"""Return a cache setting."""
return InvenTree.config.get_setting(
f'INVENTREE_CACHE_{name.upper()}', f'cache.{name.lower()}', default, **kwargs
)
def cache_host():
"""Return the cache host address."""
return cache_setting('host', None)
def cache_port():
"""Return the cache port."""
return cache_setting('port', '6379', typecast=int)
def is_global_cache_enabled():
"""Check if the global cache is enabled.
- Test if the user has enabled and configured global cache
- Test if it is appropriate to enable global cache based on the current operation.
"""
host = cache_host()
# Test if cache is enabled
# If the cache host is set, then the "default" action is to enable the cache
if not cache_setting('enabled', host is not None, typecast=bool):
return False
# Test if the cache is configured
if not cache_host():
logger.warning('Global cache is enabled, but no cache host is configured!')
return False
# The cache should not be used during certain operations
if any((
InvenTree.ready.isRunningBackup(),
InvenTree.ready.isRunningMigrations(),
InvenTree.ready.isRebuildingData(),
InvenTree.ready.isImportingData(),
InvenTree.ready.isInTestMode(),
)):
logger.info('Global cache bypassed for this operation')
return False
logger.info('Global cache enabled')
return True
def get_cache_config(global_cache: bool) -> dict:
"""Return the cache configuration options.
Args:
global_cache: True if the global cache is enabled.
Returns:
A dictionary containing the cache configuration options.
"""
if global_cache:
return {
'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': f'redis://{cache_host()}:{cache_port()}/0',
'OPTIONS': {
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
'SOCKET_CONNECT_TIMEOUT': cache_setting(
'connect_timeout', 5, typecast=int
),
'SOCKET_TIMEOUT': cache_setting('timeout', 3, typecast=int),
'CONNECTION_POOL_KWARGS': {
'socket_keepalive': cache_setting(
'tcp_keepalive', True, typecast=bool
),
'socket_keepalive_options': {
socket.TCP_KEEPCNT: cache_setting(
'keepalive_count', 5, typecast=int
),
socket.TCP_KEEPIDLE: cache_setting(
'keepalive_idle', 1, typecast=int
),
socket.TCP_KEEPINTVL: cache_setting(
'keepalive_interval', 1, typecast=int
),
socket.TCP_USER_TIMEOUT: cache_setting(
'user_timeout', 1000, typecast=int
),
},
},
},
}
# Default: Use django local memory cache
return {'BACKEND': 'django.core.cache.backends.locmem.LocMemCache'}

View File

@@ -63,7 +63,7 @@ def get_base_url(request=None):
# Check if a global InvenTree setting is provided
try:
if site_url := common.models.InvenTreeSetting.get_setting(
'INVENTREE_BASE_URL', create=False, cache=False
'INVENTREE_BASE_URL', create=False
):
return site_url
except (ProgrammingError, OperationalError):

View File

@@ -11,7 +11,6 @@ database setup in this file.
import logging
import os
import socket
import sys
from pathlib import Path
@@ -25,6 +24,7 @@ import moneyed
import pytz
from dotenv import load_dotenv
from InvenTree.cache import get_cache_config, is_global_cache_enabled
from InvenTree.config import get_boolean_setting, get_custom_file, get_setting
from InvenTree.ready import isInMainThread
from InvenTree.sentry import default_sentry_dsn, init_sentry
@@ -804,38 +804,9 @@ if TRACING_ENABLED: # pragma: no cover
# endregion
# Cache configuration
cache_host = get_setting('INVENTREE_CACHE_HOST', 'cache.host', None)
cache_port = get_setting('INVENTREE_CACHE_PORT', 'cache.port', '6379', typecast=int)
GLOBAL_CACHE_ENABLED = is_global_cache_enabled()
if cache_host: # pragma: no cover
# We are going to rely upon a possibly non-localhost for our cache,
# so don't wait too long for the cache as nothing in the cache should be
# irreplaceable.
_cache_options = {
'CLIENT_CLASS': 'django_redis.client.DefaultClient',
'SOCKET_CONNECT_TIMEOUT': int(os.getenv('CACHE_CONNECT_TIMEOUT', '2')),
'SOCKET_TIMEOUT': int(os.getenv('CACHE_SOCKET_TIMEOUT', '2')),
'CONNECTION_POOL_KWARGS': {
'socket_keepalive': config.is_true(os.getenv('CACHE_TCP_KEEPALIVE', '1')),
'socket_keepalive_options': {
socket.TCP_KEEPCNT: int(os.getenv('CACHE_KEEPALIVES_COUNT', '5')),
socket.TCP_KEEPIDLE: int(os.getenv('CACHE_KEEPALIVES_IDLE', '1')),
socket.TCP_KEEPINTVL: int(os.getenv('CACHE_KEEPALIVES_INTERVAL', '1')),
socket.TCP_USER_TIMEOUT: int(
os.getenv('CACHE_TCP_USER_TIMEOUT', '1000')
),
},
},
}
CACHES = {
'default': {
'BACKEND': 'django_redis.cache.RedisCache',
'LOCATION': f'redis://{cache_host}:{cache_port}/0',
'OPTIONS': _cache_options,
}
}
else:
CACHES = {'default': {'BACKEND': 'django.core.cache.backends.locmem.LocMemCache'}}
CACHES = {'default': get_cache_config(GLOBAL_CACHE_ENABLED)}
_q_worker_timeout = int(
get_setting('INVENTREE_BACKGROUND_TIMEOUT', 'background.timeout', 90)
@@ -866,7 +837,7 @@ Q_CLUSTER = {
if SENTRY_ENABLED and SENTRY_DSN:
Q_CLUSTER['error_reporter'] = {'sentry': {'dsn': SENTRY_DSN}}
if cache_host: # pragma: no cover
if GLOBAL_CACHE_ENABLED: # pragma: no cover
# If using external redis cache, make the cache the broker for Django Q
# as well
Q_CLUSTER['django_redis'] = 'worker'
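As the hunk above shows, when the global cache is enabled the same redis instance also serves as the Django Q message broker. A condensed sketch of that conditional (the other `Q_CLUSTER` values here are illustrative placeholders, not InvenTree's actual settings):

```python
GLOBAL_CACHE_ENABLED = True  # as returned by is_global_cache_enabled()

Q_CLUSTER = {'name': 'InvenTree'}  # other options elided

if GLOBAL_CACHE_ENABLED:
    # Use the redis cache as the Django Q message broker as well
    Q_CLUSTER['django_redis'] = 'worker'
```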

View File

@@ -19,13 +19,13 @@ from secrets import compare_digest
from typing import Any, Callable, TypedDict, Union
from django.apps import apps
from django.conf import settings
from django.conf import settings as django_settings
from django.contrib.auth.models import Group, User
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
from django.contrib.humanize.templatetags.humanize import naturaltime
from django.core.cache import cache
from django.core.exceptions import AppRegistryNotReady, ValidationError
from django.core.exceptions import ValidationError
from django.core.validators import MaxValueValidator, MinValueValidator, URLValidator
from django.db import models, transaction
from django.db.models.signals import post_delete, post_save
@@ -101,7 +101,7 @@ class BaseURLValidator(URLValidator):
value = str(value).strip()
# If a configuration level value has been specified, prevent change
if settings.SITE_URL and value != settings.SITE_URL:
if django_settings.SITE_URL and value != django_settings.SITE_URL:
raise ValidationError(_('Site URL is locked by configuration'))
if len(value) == 0:
@@ -561,7 +561,7 @@ class BaseInvenTreeSetting(models.Model):
create = kwargs.pop('create', True)
# Specify if cache lookup should be performed
do_cache = kwargs.pop('cache', False)
do_cache = kwargs.pop('cache', django_settings.GLOBAL_CACHE_ENABLED)
# Prevent saving to the database during data import
if InvenTree.ready.isImportingData():
@@ -1117,7 +1117,7 @@ def settings_group_options():
def update_instance_url(setting):
"""Update the first site objects domain to url."""
if not settings.SITE_MULTI:
if not django_settings.SITE_MULTI:
return
try:
@@ -1133,7 +1133,7 @@ def update_instance_url(setting):
def update_instance_name(setting):
"""Update the first site objects name to instance name."""
if not settings.SITE_MULTI:
if not django_settings.SITE_MULTI:
return
try:
@@ -2653,14 +2653,14 @@ class ColorTheme(models.Model):
@classmethod
def get_color_themes_choices(cls):
"""Get all color themes from static folder."""
if not settings.STATIC_COLOR_THEMES_DIR.exists():
if not django_settings.STATIC_COLOR_THEMES_DIR.exists():
logger.error('Theme directory does not exist')
return []
# Get files list from css/color-themes/ folder
files_list = []
for file in settings.STATIC_COLOR_THEMES_DIR.iterdir():
for file in django_settings.STATIC_COLOR_THEMES_DIR.iterdir():
files_list.append([file.stem, file.suffix])
# Get color themes choices (CSS sheets)
@@ -3011,7 +3011,7 @@ class NotificationMessage(models.Model):
# Add timezone information if TZ is enabled (in production mode mostly)
delta = now() - (
self.creation.replace(tzinfo=timezone.utc)
if settings.USE_TZ
if django_settings.USE_TZ
else self.creation
)
return delta.seconds

View File

@@ -91,24 +91,10 @@ sentry_enabled: False
#sentry_sample_rate: 0.1
#sentry_dsn: https://custom@custom.ingest.sentry.io/custom
# OpenTelemetry tracing/metrics - disabled by default
# OpenTelemetry tracing/metrics - disabled by default - refer to the documentation for full list of options
# This can be used to send tracing data, logs and metrics to OpenTelemetry compatible backends
# See https://opentelemetry.io/ecosystem/vendors/ for a list of supported backends
# Alternatively, use environment variables eg. INVENTREE_TRACING_ENABLED, INVENTREE_TRACING_HEADERS, INVENTREE_TRACING_AUTH
#tracing:
# enabled: true
# endpoint: https://otlp-gateway-prod-eu-west-0.grafana.net/otlp
# headers:
# api-key: 'sample'
# auth:
# basic:
# username: '******'
# password: 'glc_****'
# is_http: true
# append_http: true
# console: false
# resources:
# CUSTOM_KEY: 'CUSTOM_VALUE'
tracing:
enabled: false
# Set this variable to True to enable InvenTree Plugins, or use the environment variable INVENTREE_PLUGINS_ENABLED
plugins_enabled: False
@@ -171,6 +157,13 @@ background:
timeout: 90
max_attempts: 5
# External cache configuration (refer to the documentation for full list of options)
cache:
enabled: false
host: 'inventree-cache'
port: 6379
# Login configuration
login_confirm_days: 3
login_attempts: 5

View File

@@ -2748,15 +2748,11 @@ class PartPricing(common.models.MetaMixin):
purchase_max = purchase_cost
# Also check if manual stock item pricing is included
if InvenTreeSetting.get_setting('PRICING_USE_STOCK_PRICING', True, cache=False):
if InvenTreeSetting.get_setting('PRICING_USE_STOCK_PRICING', True):
items = self.part.stock_items.all()
# Limit to stock items updated within a certain window
days = int(
InvenTreeSetting.get_setting(
'PRICING_STOCK_ITEM_AGE_DAYS', 0, cache=False
)
)
days = int(InvenTreeSetting.get_setting('PRICING_STOCK_ITEM_AGE_DAYS', 0))
if days > 0:
date_threshold = InvenTree.helpers.current_date() - timedelta(days=days)
@@ -2792,7 +2788,7 @@ class PartPricing(common.models.MetaMixin):
min_int_cost = None
max_int_cost = None
if InvenTreeSetting.get_setting('PART_INTERNAL_PRICE', False, cache=False):
if InvenTreeSetting.get_setting('PART_INTERNAL_PRICE', False):
# Only calculate internal pricing if internal pricing is enabled
for pb in self.part.internalpricebreaks.all():
cost = self.convert(pb.price)
@@ -2911,12 +2907,10 @@ class PartPricing(common.models.MetaMixin):
max_costs = [self.bom_cost_max, self.purchase_cost_max, self.internal_cost_max]
purchase_history_override = InvenTreeSetting.get_setting(
'PRICING_PURCHASE_HISTORY_OVERRIDES_SUPPLIER', False, cache=False
'PRICING_PURCHASE_HISTORY_OVERRIDES_SUPPLIER', False
)
if InvenTreeSetting.get_setting(
'PRICING_USE_SUPPLIER_PRICING', True, cache=False
):
if InvenTreeSetting.get_setting('PRICING_USE_SUPPLIER_PRICING', True):
# Add supplier pricing data, *unless* historical pricing information should override
if self.purchase_cost_min is None or not purchase_history_override:
min_costs.append(self.supplier_price_min)
@@ -2924,9 +2918,7 @@ class PartPricing(common.models.MetaMixin):
if self.purchase_cost_max is None or not purchase_history_override:
max_costs.append(self.supplier_price_max)
if InvenTreeSetting.get_setting(
'PRICING_USE_VARIANT_PRICING', True, cache=False
):
if InvenTreeSetting.get_setting('PRICING_USE_VARIANT_PRICING', True):
# Include variant pricing in overall calculations
min_costs.append(self.variant_cost_min)
max_costs.append(self.variant_cost_max)
@@ -2953,9 +2945,7 @@ class PartPricing(common.models.MetaMixin):
if overall_max is None or cost > overall_max:
overall_max = cost
if InvenTreeSetting.get_setting(
'PART_BOM_USE_INTERNAL_PRICE', False, cache=False
):
if InvenTreeSetting.get_setting('PART_BOM_USE_INTERNAL_PRICE', False):
# Check if internal pricing should override other pricing
if self.internal_cost_min is not None:
overall_min = self.internal_cost_min
@@ -4300,7 +4290,7 @@ class BomItem(
"""Return the price-range for this BOM item."""
# get internal price setting
use_internal = common.models.InvenTreeSetting.get_setting(
'PART_BOM_USE_INTERNAL_PRICE', False, cache=False
'PART_BOM_USE_INTERNAL_PRICE', False
)
prange = self.sub_part.get_price_range(
self.quantity, internal=use_internal and internal

View File

@@ -55,9 +55,7 @@ def register_event(event, *args, **kwargs):
logger.debug("Registering triggered event: '%s'", event)
# Determine if there are any plugins which are interested in responding
if settings.PLUGIN_TESTING or InvenTreeSetting.get_setting(
'ENABLE_PLUGINS_EVENTS', cache=False
):
if settings.PLUGIN_TESTING or InvenTreeSetting.get_setting('ENABLE_PLUGINS_EVENTS'):
# Check if the plugin registry needs to be reloaded
registry.check_reload()

View File

@@ -789,7 +789,7 @@ class PluginsRegistry:
for k in self.plugin_settings_keys():
try:
val = InvenTreeSetting.get_setting(k, False, cache=False, create=False)
val = InvenTreeSetting.get_setting(k, False, create=False)
msg = f'{k}-{val}'
data.update(msg.encode())
@@ -799,12 +799,7 @@
return str(data.hexdigest())
def check_reload(self):
"""Determine if the registry needs to be reloaded.
- If a "request" object is available, then we can cache the result and attach it.
- The assumption is that plugins will not change during a single request.
"""
"""Determine if the registry needs to be reloaded."""
from common.models import InvenTreeSetting
if settings.TESTING:
@@ -823,7 +818,7 @@
try:
reg_hash = InvenTreeSetting.get_setting(
'_PLUGIN_REGISTRY_HASH', '', create=False, cache=False
'_PLUGIN_REGISTRY_HASH', '', create=False
)
except Exception as exc:
logger.exception('Failed to retrieve plugin registry hash: %s', str(exc))