[WIP] Data importer (#6911)

* Adds new model for DataImportSession

* Add file extension validation

Expose to admin interface also
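
  A minimal sketch of what extension validation for the uploaded data file might look like (the function name and the exact extension set here are assumptions for illustration, not the actual InvenTree implementation):

  ```python
  import os

  # Accepted tabular formats; the exact set is an assumption for illustration
  ALLOWED_EXTENSIONS = {'csv', 'tsv', 'xlsx', 'json'}

  def validate_data_file_extension(filename: str) -> None:
      """Raise ValueError if the uploaded file has an unsupported extension."""
      ext = os.path.splitext(filename)[1].lstrip('.').lower()
      if ext not in ALLOWED_EXTENSIONS:
          raise ValueError(f"Unsupported file extension: '{ext}'")
  ```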

* Switch to new 'importer' app

* Refactoring to help prevent circular imports

* Add serializer registry

- Use @register_importer tag for any serializer class
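
  A decorator-based serializer registry can be sketched roughly like this (the registry dict and decorator internals are illustrative only; only the `@register_importer()` usage pattern comes from the PR):

  ```python
  # Minimal sketch of a decorator-based serializer registry.
  _importer_registry = {}

  def register_importer():
      """Class decorator that records a serializer class in the registry."""
      def _wrapper(cls):
          _importer_registry[cls.__name__] = cls
          return cls  # return the class unchanged
      return _wrapper

  @register_importer()
  class ProjectCodeSerializer:
      pass
  ```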

* Cleanup migration file

- Do not use one-time hard-coded values here

* Refactor code into registry.py

* Add validation for the uploaded file

- Must be importable by tablib

* Refactoring

* Adds property to retrieve matching serializer class

* Update helper functions

* Add hook to auto-assign columns on initial creation

* Rename field

* Enforce initial status value

* Add model for individual rows in the data import

* Add DataImportRow model

* Extract data rows as dict

* Update fields

- Remove "progress" field (will be calculated)
- Added "timestamp" field
- Added "complete" field to DataImportRow

* Auto-map column names

- Provide "sensible" default values

* Add API endpoint for DataImportSession

* Offload data import operation

- For large data files this may take a significant amount of time
- Offload it to the background worker process
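
  The offload pattern, in rough outline: queue the slow work and return immediately. InvenTree hands this to its background worker process; a plain worker thread stands in for that here, and all names are illustrative:

  ```python
  from queue import Queue
  from threading import Thread

  task_queue: Queue = Queue()
  imported_sessions = []

  def import_session_data(session_id):
      """Placeholder for the slow row-by-row import work."""
      imported_sessions.append(session_id)

  def worker():
      while True:
          func, args = task_queue.get()
          try:
              func(*args)
          finally:
              task_queue.task_done()

  Thread(target=worker, daemon=True).start()

  def offload_import(session_id):
      """Queue the import so the API request can return immediately."""
      task_queue.put((import_session_data, (session_id,)))
  ```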

* Refactor data import code

* Update models

- Add "columns" field to DataImportSession
- Add "errors" field to DataImportRow

* Move field mapping to a new model type

- Simpler validation

* Save "valid" status for each data row

* Include session defaults when validating row data

* Update content_excludes

- Ignore importer models in import/export

* Remove port from ALLOWED_HOST entries

* Skip table events for importer models

* Bug fixes

* Serializer updates

* Add more endpoints

- DataImportColumnMappingList
- DataImportRowList

* further updates:

- Add 'get_api_url' method
- Handle case where

* Expose "available fields" to the DataImportSession serializer

Uses the (already available) InvenTree metadata middleware

* Add detail endpoints

* Clear existing column mappings

* Add endpoint for accepting column mappings

* Add API endpoint exposing available importer serializers

* Add simple playground area for testing data importer

* Adds simple form to start new import session

- Needs work, file field does not currently function correctly

* data_file is *not* read_only

* Add check for file type

* Remove debug statements

* Refactor column mapping

- Generate mapping for each column
- Remove "columns" field
- Column names are calculated dynamically

* Fix uniqueness requirements on mapping table

* Admin updates

- Prevent deletion of mappings
- Prevent addition of mappings

* API endpoint updates

- Prevent mappings from being deleted
- Prevent mappings from being created

* Update importer drawer

* Add widget for selecting data columns

* UI tweaks

* Delete import session when closing modal

* Allow empty string value

* Complete column mapping

* Adds ability to remove rows

* Adjust drawer specs

* Add column 'description' to serializer

* Add option to hide labels in API form field

* Update column heading

* Fix frontend linting errors

* Revert drawer position

* Return correct type

* Fix shadowing

* Fix f-string

* simplify frontend code

* Move importer app

* Update API version

* Reintroduce export formats

* Add new models to RuleSet

* typescript cleanup

* Typescript cleanup

* Improvement for Switch / boolean field

* Display original row data on popover

* Only display mapped columns

* Add DataExportMixin class

- Replaces existing APIDownloadMixin
- Uses DRF serializers for exporting
- *much* more efficient
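
  The efficiency gain comes from serializing the whole queryset in one pass rather than exporting row by row through a per-model Resource class. A stand-in serializer sketches the shape (class and function names here are illustrative, not InvenTree's API):

  ```python
  from types import SimpleNamespace

  class FakeSerializer:
      """Stand-in for a DRF serializer constructed with many=True."""
      def __init__(self, queryset, many=False):
          self._rows = [vars(obj) for obj in queryset]

      @property
      def data(self):
          return self._rows

  def export_dataset(serializer_class, queryset, field_names):
      """One serializer pass over the queryset, then one row conversion."""
      rows = serializer_class(queryset, many=True).data
      return [list(field_names)] + [
          [row.get(f) for f in field_names] for row in rows
      ]
  ```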

* Create new file: importer.mixins.py

* Add new mixin to existing views which support data export

* Better error handling

* Cleanup:

- Remove references to APIDownloadMixin
- Remove download_queryset method
- All now handled by API-based export functionality

* Replace table with InvenTreeTable

- Paginate imported rows
- Data can be searched, ordered,

* Make 'pathstring' fields read-only

* Expose list of valid importer types to the API

* Exclude read-only fields

* Cleanup

* Updates for session model

- Column is now editable on mapping object
- Field is no longer editable
- Improve admin integration

* Adds new custom hook for controlling data import session

* Refactor column mapping widget

* Refactor ImportDataSelector

* Working on ImportDataSelector component

* Adds method for editing fields in import table

- Cell edit mode
- Row edit mode
- Form submission still needs work!

* Adds background task for removing old import sessions

* Fix api_version.py

* Update src/frontend/src/components/importer/ImportDataSelector.tsx

Co-authored-by: Lukas <76838159+wolflu05@users.noreply.github.com>

* Update model verbose names

* Rename mixin class

* Add serializer mixin classes

- Will allow for fine-tuning of the import/export process

* @register_importer requires specific mixin

* Check subclass for export

* Fix typos

* Refactor export serializer

- Keep operations local to the class

* Add shim class to process an individual row before exporting it

* Add mixin to existing serializers

* Add export functionality for company serializers

* Adds placeholder for custom admin class

* Update mantine styling

* spacing -> gap

* Add functionality to pre-process form data before upload

* Remove old references to download_queryset

* Improvements for data import drawer:

- Pin title at top of drawer

* Further improvements

* Fix column selection input

* Formatting improvements

* Use a <Stepper> component for better progress display

* Cleanup text

* Add export-only fields to BuildItem queryset

* Expand "export" fields for BuildItem dataset

* Skip backup and static steps in CI

* Remove hard-coded paths

* Fix for "accept_mapping" method

* Present required fields first on import session

* Add "get_importable_fields" method

* Add method for committing imported row to database

* Cleanup

* Save "complete" state after row import

* Allow prevention of column caching

* Remove debug statement

* Add basic admin table for import sessions

* Fix for table filter functions

- New mantine version requires string values

* Add filters for import session table

* Remove debug message

* fix for <FilterItem />

* Create new import session from admin page

* Cleanup playground

* Re-open an existing import session

* Memoize cell value

* Update <ImportDataSelector>

* Enable download of build line data

* Add extra detail fields

* Register data importers for the stock app

* Enable download of stock item tracking data

* Register importers for "company" app

* Register importers for the "order" app

* Add extra fields to purchase order line item serializer

* Update verbose names for order models

* Cleanup import data table rendering

* Pass session information through to cell renderer

* add separate 'field_overrides' field

* Expose 'field_overrides' to API

* Refactor import field selection

* Use override data if provided

* Fix data extraction

- Ignore columns which are not mapped

* Fix fields.pop

- Provide 'None' argument
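
  The `fields.pop` fix swaps `self.fields.pop('part_detail')` for `self.fields.pop('part_detail', None)`: with a default argument, popping a missing key becomes a safe no-op instead of raising. A quick demonstration:

  ```python
  fields = {'reference': 'BO-001'}

  # Without a default argument, pop() raises KeyError for a missing key
  missing_raised = False
  try:
      fields.pop('part_detail')
  except KeyError:
      missing_raised = True

  # With None as the default, the same call is a no-op when the key is absent
  assert fields.pop('part_detail', None) is None
  ```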

* Update import data rendering

* Handle missing / empty column names when importing data

* Bug fixin'

* Update hook

* Adds button to upload data straight to table

* Cache "available_fields"

- Reduces API access time by 85%
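
  Caching an expensive computed listing on the object is straightforward; a sketch under assumed names (the real session model's attributes may differ):

  ```python
  class DataImportSession:
      """Sketch: compute the expensive field listing once per session object."""

      def __init__(self):
          self._available_fields = None
          self.compute_count = 0  # tracks how often the expensive path runs

      def available_fields(self):
          if self._available_fields is None:
              # Expensive serializer introspection happens only once
              self.compute_count += 1
              self._available_fields = {'reference': {}, 'quantity': {}}
          return self._available_fields
  ```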

* Fix calculation of completed_row_count

* Import individual rows from import session

* Allow import of multiple simultaneous records

* Improve extraction of metadata

- Especially for related fields
- Request object no longer required

* Implement suspended rendering of model instances

* Cleanup

* Implement more columns for StockTable

* Allow stock filtering by packaging field

* Fix "stock_value" column

* Improve metadata extraction

- Handle read_only_fields in Meta
- Handle write_only_fields in Meta

* Increase maximum number of importable rows

* Force data import to run on background worker

* Add export-only fields to StockItemSerializer class

* Data conversion when performing initial import

* Various tweaks

* Fix order of operations for data import

* Rename component

* Allow import/export of more model types

* Fix verbose name

* Import rows as a bulk db operation
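
  The bulk pattern: build all instances in memory, then insert them in one query via Django's `Model.objects.bulk_create()` instead of calling `save()` per row. A hedged sketch with a stand-in model class (the ORM call is commented out so the sketch stays self-contained):

  ```python
  class ImportedRecord:
      """Stand-in for a Django model instance."""
      def __init__(self, **fields):
          self.fields = fields

  def import_rows(rows, model_cls, batch_size=500):
      """Construct every instance first, then insert them in a single query."""
      instances = [model_cls(**row) for row in rows]
      # model_cls.objects.bulk_create(instances, batch_size=batch_size)
      return instances
  ```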

* Enable download for PartCategoryTemplateTable

* Update stock item export

* Updates for unit tests

* Remove xls format for now

- Causes some bug in tablib
- Surely xlsx is OK?

* More unit test updates

* Future proof migration

* Updates

* unit tests

* Unit test fix

* Remove 'field_overrides'

- field_defaults will suffice

* Remove 'xls' as download option from frontend

* Add simple unit test for data import

* PUI tweaks

---------

Co-authored-by: Lukas <76838159+wolflu05@users.noreply.github.com>
This commit is contained in:
Oliver 2024-07-06 18:29:52 +10:00 committed by GitHub
parent 58f12f5ce5
commit 1f6cd9fc54
121 changed files with 3747 additions and 508 deletions

View File

@ -98,4 +98,4 @@ runs:
- name: Run invoke update
if: ${{ inputs.update == 'true' }}
shell: bash
run: invoke update --uv
run: invoke update --uv --skip-backup --skip-static

View File

@ -383,42 +383,6 @@ class ListCreateDestroyAPIView(BulkDeleteMixin, ListCreateAPI):
...
class APIDownloadMixin:
"""Mixin for enabling a LIST endpoint to be downloaded a file.
To download the data, add the ?export=<fmt> to the query string.
The implementing class must provided a download_queryset method,
e.g.
def download_queryset(self, queryset, export_format):
dataset = StockItemResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = 'InvenTree_Stocktake_{date}.{fmt}'.format(
date=datetime.now().strftime("%d-%b-%Y"),
fmt=export_format
)
return DownloadFile(filedata, filename)
"""
def get(self, request, *args, **kwargs):
"""Generic handler for a download request."""
export_format = request.query_params.get('export', None)
if export_format and export_format in ['csv', 'tsv', 'xls', 'xlsx']:
queryset = self.filter_queryset(self.get_queryset())
return self.download_queryset(queryset, export_format)
# Default to the parent class implementation
return super().get(request, *args, **kwargs)
def download_queryset(self, queryset, export_format):
"""This function must be implemented to provide a downloadFile request."""
raise NotImplementedError('download_queryset method not implemented!')
class APISearchViewSerializer(serializers.Serializer):
"""Serializer for the APISearchView."""

View File

@ -1,11 +1,14 @@
"""InvenTree API version information."""
# InvenTree API version
INVENTREE_API_VERSION = 210
INVENTREE_API_VERSION = 211
"""Increment this API version number whenever there is a significant change to the API that any clients need to know about."""
INVENTREE_API_TEXT = """
v211 - 2024-06-26 : https://github.com/inventree/InvenTree/pull/6911
- Adds API endpoints for managing data import and export
v210 - 2024-06-26 : https://github.com/inventree/InvenTree/pull/7518
- Adds translateable text to User API fields

View File

@ -190,7 +190,7 @@ class CustomSignupForm(SignupForm):
# check for two password fields
if not get_global_setting('LOGIN_SIGNUP_PWD_TWICE'):
self.fields.pop('password2')
self.fields.pop('password2', None)
# reorder fields
set_form_field_order(

View File

@ -429,8 +429,8 @@ def MakeBarcode(cls_name, object_pk: int, object_data=None, **kwargs):
def GetExportFormats():
"""Return a list of allowable file formats for exporting data."""
return ['csv', 'tsv', 'xls', 'xlsx', 'json', 'yaml']
"""Return a list of allowable file formats for importing or exporting tabular data."""
return ['csv', 'xlsx', 'tsv', 'json']
def DownloadFile(

View File

@ -252,7 +252,7 @@ def render_currency(
def getModelsWithMixin(mixin_class) -> list:
"""Return a list of models that inherit from the given mixin class.
"""Return a list of database models that inherit from the given mixin class.
Args:
mixin_class: The mixin class to search for

View File

@ -137,10 +137,10 @@ class InvenTreeMetadata(SimpleMetadata):
- field_value: The value of the field (if available)
- model_value: The equivalent value of the model (if available)
"""
if model_value and not field_value:
if field_value is None and model_value is not None:
return model_value
if field_value and not model_value:
if model_value is None and field_value is not None:
return field_value
# Callable values will be evaluated later
@ -160,6 +160,8 @@ class InvenTreeMetadata(SimpleMetadata):
"""Override get_serializer_info so that we can add 'default' values to any fields whose Meta.model specifies a default value."""
self.serializer = serializer
request = getattr(self, 'request', None)
serializer_info = super().get_serializer_info(serializer)
# Look for any dynamic fields which were not available when the serializer was instantiated
@ -169,12 +171,19 @@ class InvenTreeMetadata(SimpleMetadata):
# Already know about this one
continue
if hasattr(serializer, field_name):
field = getattr(serializer, field_name)
if field := getattr(serializer, field_name, None):
serializer_info[field_name] = self.get_field_info(field)
model_class = None
# Extract read_only_fields and write_only_fields from the Meta class (if available)
if meta := getattr(serializer, 'Meta', None):
read_only_fields = getattr(meta, 'read_only_fields', [])
write_only_fields = getattr(meta, 'write_only_fields', [])
else:
read_only_fields = []
write_only_fields = []
# Attributes to copy extra attributes from the model to the field (if they don't exist)
# Note that the attributes may be named differently on the underlying model!
extra_attributes = {
@ -188,16 +197,20 @@ class InvenTreeMetadata(SimpleMetadata):
model_fields = model_meta.get_field_info(model_class)
model_default_func = getattr(model_class, 'api_defaults', None)
if model_default_func:
model_default_values = model_class.api_defaults(self.request)
if model_default_func := getattr(model_class, 'api_defaults', None):
model_default_values = model_default_func(request=request) or {}
else:
model_default_values = {}
# Iterate through simple fields
for name, field in model_fields.fields.items():
if name in serializer_info.keys():
if name in read_only_fields:
serializer_info[name]['read_only'] = True
if name in write_only_fields:
serializer_info[name]['write_only'] = True
if field.has_default():
default = field.default
@ -231,6 +244,12 @@ class InvenTreeMetadata(SimpleMetadata):
# Ignore reverse relations
continue
if name in read_only_fields:
serializer_info[name]['read_only'] = True
if name in write_only_fields:
serializer_info[name]['write_only'] = True
# Extract and provide the "limit_choices_to" filters
# This is used to automatically filter AJAX requests
serializer_info[name]['filters'] = (
@ -261,7 +280,8 @@ class InvenTreeMetadata(SimpleMetadata):
if instance is None and model_class is not None:
# Attempt to find the instance based on kwargs lookup
kwargs = getattr(self.view, 'kwargs', None)
view = getattr(self, 'view', None)
kwargs = getattr(view, 'kwargs', None) if view else None
if kwargs:
pk = None
@ -318,8 +338,10 @@ class InvenTreeMetadata(SimpleMetadata):
# Force non-nullable fields to read as "required"
# (even if there is a default value!)
if not field.allow_null and not (
hasattr(field, 'allow_blank') and field.allow_blank
if (
'required' not in field_info
and not field.allow_null
and not (hasattr(field, 'allow_blank') and field.allow_blank)
):
field_info['required'] = True
@ -346,8 +368,11 @@ class InvenTreeMetadata(SimpleMetadata):
field_info['api_url'] = '/api/user/'
elif field_info['model'] == 'contenttype':
field_info['api_url'] = '/api/contenttype/'
else:
elif hasattr(model, 'get_api_url'):
field_info['api_url'] = model.get_api_url()
else:
logger.warning("'get_api_url' method not defined for %s", model)
field_info['api_url'] = getattr(model, 'api_url', None)
# Handle custom 'primary key' field
field_info['pk_field'] = getattr(field, 'pk_field', 'pk') or 'pk'

View File

@ -222,6 +222,9 @@ class DataImportMixin(object):
Models which implement this mixin should provide information on the fields available for import
"""
# TODO: This mixin should be removed after https://github.com/inventree/InvenTree/pull/6911 is implemented
# TODO: This approach to data import functionality is *outdated*
# Define a map of fields available for import
IMPORT_FIELDS = {}

View File

@ -856,7 +856,7 @@ class RemoteImageMixin(metaclass=serializers.SerializerMetaclass):
remote_image = serializers.URLField(
required=False,
allow_blank=False,
allow_blank=True,
write_only=True,
label=_('Remote Image'),
help_text=_('URL of remote image file'),

View File

@ -198,6 +198,7 @@ INSTALLED_APPS = [
'stock.apps.StockConfig',
'users.apps.UsersConfig',
'machine.apps.MachineConfig',
'importer.apps.ImporterConfig',
'web',
'generic',
'InvenTree.apps.InvenTreeConfig', # InvenTree app runs last

View File

@ -60,10 +60,6 @@ function exportFormatOptions() {
value: 'tsv',
display_name: 'TSV',
},
{
value: 'xls',
display_name: 'XLS',
},
{
value: 'xlsx',
display_name: 'XLSX',

View File

@ -256,8 +256,8 @@ def offload_task(
_func(*args, **kwargs)
except Exception as exc:
log_error('InvenTree.offload_task')
raise_warning(f"WARNING: '{taskname}' not started due to {str(exc)}")
return False
raise_warning(f"WARNING: '{taskname}' failed due to {str(exc)}")
raise exc
# Finally, task either completed successfully or was offloaded
return True

View File

@ -455,7 +455,10 @@ def get_user_color_theme(user):
"""Get current user color theme."""
from common.models import ColorTheme
if not user.is_authenticated:
try:
if not user.is_authenticated:
return 'default'
except Exception:
return 'default'
try:

View File

@ -412,12 +412,12 @@ class InvenTreeAPITestCase(ExchangeRateMixin, UserMixin, APITestCase):
# Extract filename
disposition = response.headers['Content-Disposition']
result = re.search(r'attachment; filename="([\w.]+)"', disposition)
result = re.search(r'attachment; filename="([\w\d\-.]+)"', disposition)
fn = result.groups()[0]
if expected_fn is not None:
self.assertEqual(expected_fn, fn)
self.assertRegex(fn, expected_fn)
if decode:
# Decode data and return as StringIO file object

View File

@ -21,6 +21,7 @@ from sesame.views import LoginView
import build.api
import common.api
import company.api
import importer.api
import machine.api
import order.api
import part.api
@ -80,11 +81,19 @@ admin.site.site_header = 'InvenTree Admin'
apipatterns = [
# Global search
path('admin/', include(common.api.admin_api_urls)),
path('bom/', include(part.api.bom_api_urls)),
path('build/', include(build.api.build_api_urls)),
path('company/', include(company.api.company_api_urls)),
path('importer/', include(importer.api.importer_api_urls)),
path('label/', include(report.api.label_api_urls)),
path('machine/', include(machine.api.machine_api_urls)),
path('order/', include(order.api.order_api_urls)),
path('part/', include(part.api.part_api_urls)),
path('report/', include(report.api.report_api_urls)),
path('search/', APISearchView.as_view(), name='api-search'),
path('settings/', include(common.api.settings_api_urls)),
path('part/', include(part.api.part_api_urls)),
path('bom/', include(part.api.bom_api_urls)),
path('company/', include(company.api.company_api_urls)),
path('stock/', include(stock.api.stock_api_urls)),
path(
'generate/',
include([
@ -100,14 +109,7 @@ apipatterns = [
),
]),
),
path('stock/', include(stock.api.stock_api_urls)),
path('build/', include(build.api.build_api_urls)),
path('order/', include(order.api.order_api_urls)),
path('label/', include(report.api.label_api_urls)),
path('report/', include(report.api.report_api_urls)),
path('machine/', include(machine.api.machine_api_urls)),
path('user/', include(users.api.user_urls)),
path('admin/', include(common.api.admin_api_urls)),
path('web/', include(web_api_urls)),
# Plugin endpoints
path('', include(plugin.api.plugin_api_urls)),

View File

@ -11,9 +11,11 @@ from rest_framework.exceptions import ValidationError
from django_filters.rest_framework import DjangoFilterBackend
from django_filters import rest_framework as rest_filters
from InvenTree.api import APIDownloadMixin, MetadataView
from importer.mixins import DataExportViewMixin
from InvenTree.api import MetadataView
from generic.states.api import StatusView
from InvenTree.helpers import str2bool, isNull, DownloadFile
from InvenTree.helpers import str2bool, isNull
from build.status_codes import BuildStatus, BuildStatusGroups
from InvenTree.mixins import CreateAPI, RetrieveUpdateDestroyAPI, ListCreateAPI
@ -125,7 +127,7 @@ class BuildMixin:
return queryset
class BuildList(APIDownloadMixin, BuildMixin, ListCreateAPI):
class BuildList(DataExportViewMixin, BuildMixin, ListCreateAPI):
"""API endpoint for accessing a list of Build objects.
- GET: Return list of objects (with filters)
@ -176,15 +178,6 @@ class BuildList(APIDownloadMixin, BuildMixin, ListCreateAPI):
return queryset
def download_queryset(self, queryset, export_format):
"""Download the queryset data as a file."""
dataset = build.admin.BuildResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f"InvenTree_BuildOrders.{export_format}"
return DownloadFile(filedata, filename)
def filter_queryset(self, queryset):
"""Custom query filtering for the BuildList endpoint."""
queryset = super().filter_queryset(queryset)
@ -351,7 +344,7 @@ class BuildLineEndpoint:
return queryset
class BuildLineList(BuildLineEndpoint, ListCreateAPI):
class BuildLineList(BuildLineEndpoint, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of BuildLine objects"""
filterset_class = BuildLineFilter
@ -553,7 +546,7 @@ class BuildItemFilter(rest_filters.FilterSet):
return queryset.filter(install_into=None)
class BuildItemList(ListCreateAPI):
class BuildItemList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of BuildItem objects.
- GET: Return list of objects
@ -583,10 +576,15 @@ class BuildItemList(ListCreateAPI):
queryset = queryset.select_related(
'build_line',
'build_line__build',
'build_line__bom_item',
'install_into',
'stock_item',
'stock_item__location',
'stock_item__part',
'stock_item__supplier_part',
'stock_item__supplier_part__manufacturer_part',
).prefetch_related(
'stock_item__location__tags',
)
return queryset

View File

@ -104,7 +104,7 @@ class Build(
}
@classmethod
def api_defaults(cls, request):
def api_defaults(cls, request=None):
"""Return default values for this model when issuing an API OPTIONS request."""
defaults = {
'reference': generate_next_build_reference(),

View File

@ -25,6 +25,7 @@ from stock.serializers import StockItemSerializerBrief, LocationSerializer
import common.models
from common.serializers import ProjectCodeSerializer
from importer.mixins import DataImportExportSerializerMixin
import part.filters
from part.serializers import BomItemSerializer, PartSerializer, PartBriefSerializer
from users.serializers import OwnerSerializer
@ -32,7 +33,7 @@ from users.serializers import OwnerSerializer
from .models import Build, BuildLine, BuildItem
class BuildSerializer(NotesFieldMixin, InvenTreeModelSerializer):
class BuildSerializer(NotesFieldMixin, DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializes a Build object."""
class Meta:
@ -50,6 +51,7 @@ class BuildSerializer(NotesFieldMixin, InvenTreeModelSerializer):
'destination',
'parent',
'part',
'part_name',
'part_detail',
'project_code',
'project_code_detail',
@ -84,6 +86,8 @@ class BuildSerializer(NotesFieldMixin, InvenTreeModelSerializer):
part_detail = PartBriefSerializer(source='part', many=False, read_only=True)
part_name = serializers.CharField(source='part.name', read_only=True, label=_('Part Name'))
quantity = InvenTreeDecimalField()
overdue = serializers.BooleanField(required=False, read_only=True)
@ -124,7 +128,7 @@ class BuildSerializer(NotesFieldMixin, InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if part_detail is not True:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
reference = serializers.CharField(required=True)
@ -1049,8 +1053,17 @@ class BuildAutoAllocationSerializer(serializers.Serializer):
raise ValidationError(_("Failed to start auto-allocation task"))
class BuildItemSerializer(InvenTreeModelSerializer):
"""Serializes a BuildItem object."""
class BuildItemSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializes a BuildItem object, which is an allocation of a stock item against a build order."""
# These fields are only used for data export
export_only_fields = [
'build_reference',
'bom_reference',
'sku',
'mpn',
'location_name',
]
class Meta:
"""Serializer metaclass"""
@ -1062,18 +1075,36 @@ class BuildItemSerializer(InvenTreeModelSerializer):
'install_into',
'stock_item',
'quantity',
'location',
# Detail fields, can be included or excluded
'build_detail',
'location_detail',
'part_detail',
'stock_item_detail',
'build_detail',
# The following fields are only used for data export
'bom_reference',
'build_reference',
'location_name',
'mpn',
'sku',
]
# Export-only fields
sku = serializers.CharField(source='stock_item.supplier_part.SKU', label=_('Supplier Part Number'), read_only=True)
mpn = serializers.CharField(source='stock_item.supplier_part.manufacturer_part.MPN', label=_('Manufacturer Part Number'), read_only=True)
location_name = serializers.CharField(source='stock_item.location.name', label=_('Location Name'), read_only=True)
build_reference = serializers.CharField(source='build.reference', label=_('Build Reference'), read_only=True)
bom_reference = serializers.CharField(source='build_line.bom_item.reference', label=_('BOM Reference'), read_only=True)
# Annotated fields
build = serializers.PrimaryKeyRelatedField(source='build_line.build', many=False, read_only=True)
# Extra (optional) detail fields
part_detail = PartBriefSerializer(source='stock_item.part', many=False, read_only=True, pricing=False)
stock_item_detail = StockItemSerializerBrief(source='stock_item', read_only=True)
location = serializers.PrimaryKeyRelatedField(source='stock_item.location', many=False, read_only=True)
location_detail = LocationSerializer(source='stock_item.location', read_only=True)
build_detail = BuildSerializer(source='build_line.build', many=False, read_only=True)
@ -1089,21 +1120,25 @@ class BuildItemSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if not location_detail:
self.fields.pop('location_detail')
self.fields.pop('location_detail', None)
if not stock_detail:
self.fields.pop('stock_item_detail')
self.fields.pop('stock_item_detail', None)
if not build_detail:
self.fields.pop('build_detail')
self.fields.pop('build_detail', None)
class BuildLineSerializer(InvenTreeModelSerializer):
class BuildLineSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializer for a BuildItem object."""
export_exclude_fields = [
'allocations',
]
class Meta:
"""Serializer metaclass"""
@ -1117,6 +1152,17 @@ class BuildLineSerializer(InvenTreeModelSerializer):
'quantity',
'allocations',
# BOM item detail fields
'reference',
'consumable',
'optional',
'trackable',
# Part detail fields
'part',
'part_name',
'part_IPN',
# Annotated fields
'allocated',
'in_production',
@ -1134,7 +1180,18 @@ class BuildLineSerializer(InvenTreeModelSerializer):
'allocations',
]
quantity = serializers.FloatField()
# Part info fields
part = serializers.PrimaryKeyRelatedField(source='bom_item.sub_part', label=_('Part'), many=False, read_only=True)
part_name = serializers.CharField(source='bom_item.sub_part.name', label=_('Part Name'), read_only=True)
part_IPN = serializers.CharField(source='bom_item.sub_part.IPN', label=_('Part IPN'), read_only=True)
# BOM item info fields
reference = serializers.CharField(source='bom_item.reference', label=_('Reference'), read_only=True)
consumable = serializers.BooleanField(source='bom_item.consumable', label=_('Consumable'), read_only=True)
optional = serializers.BooleanField(source='bom_item.optional', label=_('Optional'), read_only=True)
trackable = serializers.BooleanField(source='bom_item.sub_part.trackable', label=_('Trackable'), read_only=True)
quantity = serializers.FloatField(label=_('Quantity'))
bom_item = serializers.PrimaryKeyRelatedField(label=_('BOM Item'), read_only=True)
@ -1164,10 +1221,10 @@ class BuildLineSerializer(InvenTreeModelSerializer):
read_only=True
)
available_substitute_stock = serializers.FloatField(read_only=True)
available_variant_stock = serializers.FloatField(read_only=True)
total_available_stock = serializers.FloatField(read_only=True)
external_stock = serializers.FloatField(read_only=True)
available_substitute_stock = serializers.FloatField(read_only=True, label=_('Available Substitute Stock'))
available_variant_stock = serializers.FloatField(read_only=True, label=_('Available Variant Stock'))
total_available_stock = serializers.FloatField(read_only=True, label=_('Total Available Stock'))
external_stock = serializers.FloatField(read_only=True, label=_('External Stock'))
@staticmethod
def annotate_queryset(queryset, build=None):

View File

@ -564,16 +564,16 @@ class BuildTest(BuildAPITest):
def test_download_build_orders(self):
"""Test that we can download a list of build orders via the API"""
required_cols = [
'reference',
'status',
'completed',
'batch',
'notes',
'title',
'part',
'part_name',
'id',
'quantity',
'Reference',
'Build Status',
'Completed items',
'Batch Code',
'Notes',
'Description',
'Part',
'Part Name',
'ID',
'Quantity',
]
excluded_cols = [
@ -597,13 +597,13 @@ class BuildTest(BuildAPITest):
for row in data:
build = Build.objects.get(pk=row['id'])
build = Build.objects.get(pk=row['ID'])
self.assertEqual(str(build.part.pk), row['part'])
self.assertEqual(build.part.full_name, row['part_name'])
self.assertEqual(str(build.part.pk), row['Part'])
self.assertEqual(build.part.name, row['Part Name'])
self.assertEqual(build.reference, row['reference'])
self.assertEqual(build.title, row['title'])
self.assertEqual(build.reference, row['Reference'])
self.assertEqual(build.title, row['Description'])
class BuildAllocationTest(BuildAPITest):

View File

@ -27,6 +27,7 @@ import common.models
import common.serializers
from common.settings import get_global_setting
from generic.states.api import AllStatusViews, StatusView
from importer.mixins import DataExportViewMixin
from InvenTree.api import BulkDeleteMixin, MetadataView
from InvenTree.config import CONFIG_LOOKUPS
from InvenTree.filters import ORDER_FILTER, SEARCH_ORDER_FILTER
@ -494,7 +495,7 @@ class NotesImageList(ListCreateAPI):
image.save()
class ProjectCodeList(ListCreateAPI):
class ProjectCodeList(DataExportViewMixin, ListCreateAPI):
"""List view for all project codes."""
queryset = common.models.ProjectCode.objects.all()
@ -515,7 +516,7 @@ class ProjectCodeDetail(RetrieveUpdateDestroyAPI):
permission_classes = [permissions.IsAuthenticated, IsStaffOrReadOnly]
class CustomUnitList(ListCreateAPI):
class CustomUnitList(DataExportViewMixin, ListCreateAPI):
"""List view for custom units."""
queryset = common.models.CustomUnit.objects.all()

View File

@ -17,5 +17,8 @@ class Migration(migrations.Migration):
('code', models.CharField(help_text='Unique project code', max_length=50, unique=True, verbose_name='Project Code')),
('description', models.CharField(blank=True, help_text='Project description', max_length=200, verbose_name='Description')),
],
options={
'verbose_name': 'Project Code',
},
),
]

View File

@ -18,5 +18,8 @@ class Migration(migrations.Migration):
('symbol', models.CharField(blank=True, help_text='Optional unit symbol', max_length=10, unique=True, verbose_name='Symbol')),
('definition', models.CharField(help_text='Unit definition', max_length=50, verbose_name='Definition')),
],
options={
'verbose_name': 'Custom Unit',
},
),
]

View File

@ -1,5 +1,6 @@
# Generated by Django 4.2.12 on 2024-06-02 13:32
from django.conf import settings
from django.db import migrations
from moneyed import CURRENCIES
@@ -47,16 +48,20 @@ def set_currencies(apps, schema_editor):
return
value = ','.join(valid_codes)
print(f"Found existing currency codes:", value)
if not settings.TESTING:
print(f"Found existing currency codes:", value)
setting = InvenTreeSetting.objects.filter(key=key).first()
if setting:
print(f"- Updating existing setting for currency codes")
if not settings.TESTING:
print(f"- Updating existing setting for currency codes")
setting.value = value
setting.save()
else:
print(f"- Creating new setting for currency codes")
if not settings.TESTING:
print(f"- Creating new setting for currency codes")
setting = InvenTreeSetting(key=key, value=value)
setting.save()

View File

@@ -116,6 +116,11 @@ class BaseURLValidator(URLValidator):
class ProjectCode(InvenTree.models.InvenTreeMetadataModel):
"""A ProjectCode is a unique identifier for a project."""
class Meta:
"""Class options for the ProjectCode model."""
verbose_name = _('Project Code')
@staticmethod
def get_api_url():
"""Return the API URL for this model."""
@@ -3048,6 +3053,11 @@ class CustomUnit(models.Model):
https://pint.readthedocs.io/en/stable/advanced/defining.html
"""
class Meta:
"""Class meta options."""
verbose_name = _('Custom Unit')
def fmt_string(self):
"""Construct a unit definition string e.g. 'dog_year = 52 * day = dy'."""
fmt = f'{self.name} = {self.definition}'

View File

@@ -14,6 +14,8 @@ from taggit.serializers import TagListSerializerField
import common.models as common_models
import common.validators
from importer.mixins import DataImportExportSerializerMixin
from importer.registry import register_importer
from InvenTree.helpers import get_objectreference
from InvenTree.helpers_model import construct_absolute_url
from InvenTree.serializers import (
@@ -293,7 +295,8 @@ class NotesImageSerializer(InvenTreeModelSerializer):
image = InvenTreeImageSerializerField(required=True)
class ProjectCodeSerializer(InvenTreeModelSerializer):
@register_importer()
class ProjectCodeSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializer for the ProjectCode model."""
class Meta:
@@ -341,7 +344,8 @@ class ContentTypeSerializer(serializers.Serializer):
return obj.app_label in plugin_registry.installed_apps
class CustomUnitSerializer(InvenTreeModelSerializer):
@register_importer()
class CustomUnitSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""DRF serializer for CustomUnit model."""
class Meta:

View File

@@ -1376,7 +1376,7 @@ class ProjectCodesTest(InvenTreeAPITestCase):
)
self.assertIn(
'project code with this Project Code already exists',
'Project Code with this Project Code already exists',
str(response.data['code']),
)

View File

@@ -6,6 +6,8 @@ from import_export import widgets
from import_export.admin import ImportExportModelAdmin
from import_export.fields import Field
import company.serializers
import importer.admin
from InvenTree.admin import InvenTreeResource
from part.models import Part
@@ -33,9 +35,10 @@ class CompanyResource(InvenTreeResource):
@admin.register(Company)
class CompanyAdmin(ImportExportModelAdmin):
class CompanyAdmin(importer.admin.DataExportAdmin, ImportExportModelAdmin):
"""Admin class for the Company model."""
serializer_class = company.serializers.CompanySerializer
resource_class = CompanyResource
list_display = ('name', 'website', 'contact')

View File

@@ -7,12 +7,9 @@ from django.utils.translation import gettext_lazy as _
from django_filters import rest_framework as rest_filters
import part.models
from importer.mixins import DataExportViewMixin
from InvenTree.api import ListCreateDestroyAPIView, MetadataView
from InvenTree.filters import (
ORDER_FILTER,
SEARCH_ORDER_FILTER,
SEARCH_ORDER_FILTER_ALIAS,
)
from InvenTree.filters import SEARCH_ORDER_FILTER, SEARCH_ORDER_FILTER_ALIAS
from InvenTree.helpers import str2bool
from InvenTree.mixins import ListCreateAPI, RetrieveUpdateDestroyAPI
@@ -36,7 +33,7 @@ from .serializers import (
)
class CompanyList(ListCreateAPI):
class CompanyList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of Company objects.
Provides two methods:
@@ -84,7 +81,7 @@ class CompanyDetail(RetrieveUpdateDestroyAPI):
return queryset
class ContactList(ListCreateDestroyAPIView):
class ContactList(DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for list view of Company model."""
queryset = Contact.objects.all()
@@ -108,7 +105,7 @@ class ContactDetail(RetrieveUpdateDestroyAPI):
serializer_class = ContactSerializer
class AddressList(ListCreateDestroyAPIView):
class AddressList(DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for list view of Address model."""
queryset = Address.objects.all()
@@ -149,7 +146,7 @@ class ManufacturerPartFilter(rest_filters.FilterSet):
)
class ManufacturerPartList(ListCreateDestroyAPIView):
class ManufacturerPartList(DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for list view of ManufacturerPart object.
- GET: Return list of ManufacturerPart objects
@@ -297,7 +294,7 @@ class SupplierPartFilter(rest_filters.FilterSet):
)
class SupplierPartList(ListCreateDestroyAPIView):
class SupplierPartList(DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for list view of SupplierPart object.
- GET: Return list of SupplierPart objects

View File

@@ -44,6 +44,9 @@ class Migration(migrations.Migration):
('email', models.EmailField(blank=True, max_length=254)),
('role', models.CharField(blank=True, max_length=100)),
],
options={
'verbose_name': 'Contact',
}
),
migrations.CreateModel(
name='SupplierPart',
@@ -75,6 +78,7 @@ class Migration(migrations.Migration):
('part', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='pricebreaks', to='company.SupplierPart')),
],
options={
'verbose_name': 'Supplier Price Break',
'db_table': 'part_supplierpricebreak',
},
),

View File

@@ -23,17 +23,17 @@ class Migration(migrations.Migration):
migrations.AlterField(
model_name='company',
name='is_customer',
field=models.BooleanField(default=False, help_text='Do you sell items to this company?', verbose_name='is customer'),
field=models.BooleanField(default=False, help_text='Do you sell items to this company?', verbose_name='Is customer'),
),
migrations.AlterField(
model_name='company',
name='is_manufacturer',
field=models.BooleanField(default=False, help_text='Does this company manufacture parts?', verbose_name='is manufacturer'),
field=models.BooleanField(default=False, help_text='Does this company manufacture parts?', verbose_name='Is manufacturer'),
),
migrations.AlterField(
model_name='company',
name='is_supplier',
field=models.BooleanField(default=True, help_text='Do you purchase items from this company?', verbose_name='is supplier'),
field=models.BooleanField(default=True, help_text='Do you purchase items from this company?', verbose_name='Is supplier'),
),
migrations.AlterField(
model_name='company',

View File

@@ -21,6 +21,7 @@ class Migration(migrations.Migration):
('manufacturer_part', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='parameters', to='company.manufacturerpart', verbose_name='Manufacturer Part')),
],
options={
'verbose_name': 'Manufacturer Part Parameter',
'unique_together': {('manufacturer_part', 'name')},
},
),

View File

@@ -12,7 +12,10 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterModelOptions(
name='address',
options={'verbose_name_plural': 'Addresses'},
options={
'verbose_name': 'Address',
'verbose_name_plural': 'Addresses'
},
),
migrations.AlterField(
model_name='address',

View File

@@ -165,19 +165,19 @@ class Company(
is_customer = models.BooleanField(
default=False,
verbose_name=_('is customer'),
verbose_name=_('Is customer'),
help_text=_('Do you sell items to this company?'),
)
is_supplier = models.BooleanField(
default=True,
verbose_name=_('is supplier'),
verbose_name=_('Is supplier'),
help_text=_('Do you purchase items from this company?'),
)
is_manufacturer = models.BooleanField(
default=False,
verbose_name=_('is manufacturer'),
verbose_name=_('Is manufacturer'),
help_text=_('Does this company manufacture parts?'),
)
@@ -269,6 +269,11 @@ class Contact(InvenTree.models.InvenTreeMetadataModel):
role: position in company
"""
class Meta:
"""Metaclass defines extra model options."""
verbose_name = _('Contact')
@staticmethod
def get_api_url():
"""Return the API URL associated with the Contcat model."""
@@ -306,7 +311,8 @@ class Address(InvenTree.models.InvenTreeModel):
class Meta:
"""Metaclass defines extra model options."""
verbose_name_plural = 'Addresses'
verbose_name = _('Address')
verbose_name_plural = _('Addresses')
def __init__(self, *args, **kwargs):
"""Custom init function."""
@@ -560,6 +566,7 @@ class ManufacturerPartParameter(InvenTree.models.InvenTreeModel):
class Meta:
"""Metaclass defines extra model options."""
verbose_name = _('Manufacturer Part Parameter')
unique_together = ('manufacturer_part', 'name')
@staticmethod
@@ -1005,6 +1012,7 @@ class SupplierPriceBreak(common.models.PriceBreak):
class Meta:
"""Metaclass defines extra model options."""
verbose_name = _('Supplier Price Break')
unique_together = ('part', 'quantity')
# This model was moved from the 'Part' app

View File

@@ -10,6 +10,8 @@ from sql_util.utils import SubqueryCount
from taggit.serializers import TagListSerializerField
import part.filters
from importer.mixins import DataImportExportSerializerMixin
from importer.registry import register_importer
from InvenTree.serializers import (
InvenTreeCurrencySerializer,
InvenTreeDecimalField,
@@ -56,7 +58,8 @@ class CompanyBriefSerializer(InvenTreeModelSerializer):
thumbnail = serializers.CharField(source='get_thumbnail_url', read_only=True)
class AddressSerializer(InvenTreeModelSerializer):
@register_importer()
class AddressSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializer for the Address Model."""
class Meta:
@@ -100,9 +103,19 @@
]
class CompanySerializer(NotesFieldMixin, RemoteImageMixin, InvenTreeModelSerializer):
@register_importer()
class CompanySerializer(
DataImportExportSerializerMixin,
NotesFieldMixin,
RemoteImageMixin,
InvenTreeModelSerializer,
):
"""Serializer for Company object (full detail)."""
export_exclude_fields = ['url', 'primary_address']
import_exclude_fields = ['image']
class Meta:
"""Metaclass options."""
@@ -183,17 +196,25 @@ class CompanySerializer(NotesFieldMixin, RemoteImageMixin, InvenTreeModelSeriali
return self.instance
class ContactSerializer(InvenTreeModelSerializer):
@register_importer()
class ContactSerializer(DataImportExportSerializerMixin, InvenTreeModelSerializer):
"""Serializer class for the Contact model."""
class Meta:
"""Metaclass options."""
model = Contact
fields = ['pk', 'company', 'name', 'phone', 'email', 'role']
fields = ['pk', 'company', 'company_name', 'name', 'phone', 'email', 'role']
company_name = serializers.CharField(
label=_('Company Name'), source='company.name', read_only=True
)
class ManufacturerPartSerializer(InvenTreeTagModelSerializer):
@register_importer()
class ManufacturerPartSerializer(
DataImportExportSerializerMixin, InvenTreeTagModelSerializer
):
"""Serializer for ManufacturerPart object."""
class Meta:
@@ -225,13 +246,13 @@ class ManufacturerPartSerializer(InvenTreeTagModelSerializer):
super().__init__(*args, **kwargs)
if part_detail is not True:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if manufacturer_detail is not True:
self.fields.pop('manufacturer_detail')
self.fields.pop('manufacturer_detail', None)
if prettify is not True:
self.fields.pop('pretty_name')
self.fields.pop('pretty_name', None)
part_detail = PartBriefSerializer(source='part', many=False, read_only=True)
@@ -246,7 +267,10 @@ class ManufacturerPartSerializer(InvenTreeTagModelSerializer):
)
class ManufacturerPartParameterSerializer(InvenTreeModelSerializer):
@register_importer()
class ManufacturerPartParameterSerializer(
DataImportExportSerializerMixin, InvenTreeModelSerializer
):
"""Serializer for the ManufacturerPartParameter model."""
class Meta:
@@ -270,14 +294,17 @@ class ManufacturerPartParameterSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not man_detail:
self.fields.pop('manufacturer_part_detail')
self.fields.pop('manufacturer_part_detail', None)
manufacturer_part_detail = ManufacturerPartSerializer(
source='manufacturer_part', many=False, read_only=True
)
class SupplierPartSerializer(InvenTreeTagModelSerializer):
@register_importer()
class SupplierPartSerializer(
DataImportExportSerializerMixin, InvenTreeTagModelSerializer
):
"""Serializer for SupplierPart object."""
class Meta:
@@ -341,17 +368,17 @@ class SupplierPartSerializer(InvenTreeTagModelSerializer):
super().__init__(*args, **kwargs)
if part_detail is not True:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if supplier_detail is not True:
self.fields.pop('supplier_detail')
self.fields.pop('supplier_detail', None)
if manufacturer_detail is not True:
self.fields.pop('manufacturer_detail')
self.fields.pop('manufacturer_part_detail')
self.fields.pop('manufacturer_detail', None)
self.fields.pop('manufacturer_part_detail', None)
if prettify is not True:
self.fields.pop('pretty_name')
self.fields.pop('pretty_name', None)
# Annotated field showing total in-stock quantity
in_stock = serializers.FloatField(read_only=True, label=_('In Stock'))
@@ -435,7 +462,10 @@ class SupplierPartSerializer(InvenTreeTagModelSerializer):
return supplier_part
class SupplierPriceBreakSerializer(InvenTreeModelSerializer):
@register_importer()
class SupplierPriceBreakSerializer(
DataImportExportSerializerMixin, InvenTreeModelSerializer
):
"""Serializer for SupplierPriceBreak object."""
class Meta:
@@ -462,10 +492,10 @@ class SupplierPriceBreakSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not supplier_detail:
self.fields.pop('supplier_detail')
self.fields.pop('supplier_detail', None)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
quantity = InvenTreeDecimalField()
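The recurring change in this file from `self.fields.pop(name)` to `self.fields.pop(name, None)` is defensive: a subclass or mixin may have already removed the field, and `pop` without a default raises `KeyError` for a missing key. A quick illustration:

```python
# dict.pop with a default makes repeated removal safe and idempotent.
fields = {'part_detail': 'f1', 'pretty_name': 'f2'}

fields.pop('part_detail')  # fine: the key exists
# A second fields.pop('part_detail') would raise KeyError here.

# With a default, removal never raises, even if the key is already gone:
assert fields.pop('part_detail', None) is None
assert fields.pop('pretty_name', None) == 'f2'
```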

View File

@@ -0,0 +1,80 @@
"""Admin site specification for the 'importer' app."""
from django.contrib import admin
from django.urls import path
import importer.models
import importer.registry
class DataImportColumnMapAdmin(admin.TabularInline):
"""Inline admin for DataImportColumnMap model."""
model = importer.models.DataImportColumnMap
can_delete = False
max_num = 0
def get_readonly_fields(self, request, obj=None):
"""Return the readonly fields for the admin interface."""
return ['field']
def formfield_for_dbfield(self, db_field, request, **kwargs):
"""Override the choices for the column field."""
if db_field.name == 'column':
# TODO: Implement this!
queryset = self.get_queryset(request)
if queryset.count() > 0:
session = queryset.first().session
db_field.choices = [(col, col) for col in session.columns]
return super().formfield_for_choice_field(db_field, request, **kwargs)
@admin.register(importer.models.DataImportSession)
class DataImportSessionAdmin(admin.ModelAdmin):
"""Admin interface for the DataImportSession model."""
list_display = ['id', 'data_file', 'status', 'user']
list_filter = ['status']
inlines = [DataImportColumnMapAdmin]
def get_readonly_fields(self, request, obj=None):
"""Update the readonly fields for the admin interface."""
fields = ['columns', 'status', 'timestamp']
# Prevent data file from being edited after upload!
if obj:
fields += ['data_file']
else:
fields += ['field_mapping']
return fields
def formfield_for_dbfield(self, db_field, request, **kwargs):
"""Override the choices for the model_type field."""
if db_field.name == 'model_type':
db_field.choices = importer.registry.supported_model_options()
return super().formfield_for_dbfield(db_field, request, **kwargs)
@admin.register(importer.models.DataImportRow)
class DataImportRowAdmin(admin.ModelAdmin):
"""Admin interface for the DataImportRow model."""
list_display = ['id', 'session', 'row_index']
def get_readonly_fields(self, request, obj=None):
"""Return the readonly fields for the admin interface."""
return ['session', 'row_index', 'row_data', 'errors', 'valid']
class DataExportAdmin(admin.ModelAdmin):
"""Custom admin class mixin allowing for data export functionality."""
serializer_class = None
# TODO: Add custom admin action to export queryset data

View File

@@ -0,0 +1,200 @@
"""API endpoints for the importer app."""
from django.shortcuts import get_object_or_404
from django.urls import include, path
from drf_spectacular.utils import extend_schema
from rest_framework import permissions
from rest_framework.response import Response
from rest_framework.views import APIView
import importer.models
import importer.registry
import importer.serializers
from InvenTree.api import BulkDeleteMixin
from InvenTree.filters import SEARCH_ORDER_FILTER
from InvenTree.mixins import (
CreateAPI,
ListAPI,
ListCreateAPI,
RetrieveUpdateAPI,
RetrieveUpdateDestroyAPI,
)
class DataImporterModelList(APIView):
"""API endpoint for displaying a list of models available for import."""
permission_classes = [permissions.IsAuthenticated]
def get(self, request):
"""Return a list of models available for import."""
models = []
for serializer in importer.registry.get_supported_serializers():
model = serializer.Meta.model
url = model.get_api_url() if hasattr(model, 'get_api_url') else None
models.append({
'serializer': str(serializer.__name__),
'model_type': model.__name__.lower(),
'api_url': url,
})
return Response(models)
class DataImportSessionList(BulkDeleteMixin, ListCreateAPI):
"""API endpoint for accessing a list of DataImportSession objects."""
queryset = importer.models.DataImportSession.objects.all()
serializer_class = importer.serializers.DataImportSessionSerializer
filter_backends = SEARCH_ORDER_FILTER
filterset_fields = ['model_type', 'status', 'user']
ordering_fields = ['timestamp', 'status', 'model_type']
class DataImportSessionDetail(RetrieveUpdateDestroyAPI):
"""Detail endpoint for a single DataImportSession object."""
queryset = importer.models.DataImportSession.objects.all()
serializer_class = importer.serializers.DataImportSessionSerializer
class DataImportSessionAcceptFields(APIView):
"""API endpoint to accept the field mapping for a DataImportSession."""
permission_classes = [permissions.IsAuthenticated]
@extend_schema(
responses={200: importer.serializers.DataImportSessionSerializer(many=False)}
)
def post(self, request, pk):
"""Accept the field mapping for a DataImportSession."""
session = get_object_or_404(importer.models.DataImportSession, pk=pk)
# Attempt to accept the mapping (may raise an exception if the mapping is invalid)
session.accept_mapping()
return Response(importer.serializers.DataImportSessionSerializer(session).data)
class DataImportSessionAcceptRows(CreateAPI):
"""API endpoint to accept the rows for a DataImportSession."""
queryset = importer.models.DataImportSession.objects.all()
serializer_class = importer.serializers.DataImportAcceptRowSerializer
def get_serializer_context(self):
"""Add the import session object to the serializer context."""
ctx = super().get_serializer_context()
try:
ctx['session'] = importer.models.DataImportSession.objects.get(
pk=self.kwargs.get('pk', None)
)
except Exception:
pass
ctx['request'] = self.request
return ctx
class DataImportColumnMappingList(ListAPI):
"""API endpoint for accessing a list of DataImportColumnMap objects."""
queryset = importer.models.DataImportColumnMap.objects.all()
serializer_class = importer.serializers.DataImportColumnMapSerializer
filter_backends = SEARCH_ORDER_FILTER
filterset_fields = ['session']
class DataImportColumnMappingDetail(RetrieveUpdateAPI):
"""Detail endpoint for a single DataImportColumnMap object."""
queryset = importer.models.DataImportColumnMap.objects.all()
serializer_class = importer.serializers.DataImportColumnMapSerializer
class DataImportRowList(BulkDeleteMixin, ListAPI):
"""API endpoint for accessing a list of DataImportRow objects."""
queryset = importer.models.DataImportRow.objects.all()
serializer_class = importer.serializers.DataImportRowSerializer
filter_backends = SEARCH_ORDER_FILTER
filterset_fields = ['session', 'valid', 'complete']
ordering_fields = ['pk', 'row_index', 'valid']
ordering = 'row_index'
class DataImportRowDetail(RetrieveUpdateDestroyAPI):
"""Detail endpoint for a single DataImportRow object."""
queryset = importer.models.DataImportRow.objects.all()
serializer_class = importer.serializers.DataImportRowSerializer
importer_api_urls = [
path('models/', DataImporterModelList.as_view(), name='api-importer-model-list'),
path(
'session/',
include([
path(
'<int:pk>/',
include([
path(
'accept_fields/',
DataImportSessionAcceptFields.as_view(),
name='api-import-session-accept-fields',
),
path(
'accept_rows/',
DataImportSessionAcceptRows.as_view(),
name='api-import-session-accept-rows',
),
path(
'',
DataImportSessionDetail.as_view(),
name='api-import-session-detail',
),
]),
),
path('', DataImportSessionList.as_view(), name='api-importer-session-list'),
]),
),
path(
'column-mapping/',
include([
path(
'<int:pk>/',
DataImportColumnMappingDetail.as_view(),
name='api-importer-mapping-detail',
),
path(
'',
DataImportColumnMappingList.as_view(),
name='api-importer-mapping-list',
),
]),
),
path(
'row/',
include([
path(
'<int:pk>/',
DataImportRowDetail.as_view(),
name='api-importer-row-detail',
),
path('', DataImportRowList.as_view(), name='api-importer-row-list'),
]),
),
]
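A typical client-side import walks these endpoints in order. The sketch below uses `requests`; the host, token handling, and exact payload shapes (e.g. what `accept_rows/` expects in its body) are assumptions, not confirmed by this PR:

```python
def run_import_flow(csv_path, token, base='http://localhost:8000/api/importer'):
    """Walk the import-session endpoints in order (illustrative sketch only)."""
    import requests  # third-party HTTP client, assumed available

    headers = {'Authorization': f'Token {token}'}

    # 1. Create a session by uploading the data file
    with open(csv_path, 'rb') as f:
        session = requests.post(
            f'{base}/session/',
            data={'model_type': 'company'},
            files={'data_file': f},
            headers=headers,
        ).json()
    pk = session['pk']

    # 2. Inspect the auto-generated column mappings; PATCH any that are wrong
    requests.get(f'{base}/column-mapping/', params={'session': pk}, headers=headers)

    # 3. Lock in the field mapping, then accept the validated rows
    requests.post(f'{base}/session/{pk}/accept_fields/', headers=headers)
    requests.post(f'{base}/session/{pk}/accept_rows/', headers=headers)
```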

View File

@@ -0,0 +1,10 @@
"""AppConfig for the 'importer' app."""
from django.apps import AppConfig
class ImporterConfig(AppConfig):
"""AppConfig class for the 'importer' app."""
default_auto_field = 'django.db.models.BigAutoField'
name = 'importer'

View File

@@ -0,0 +1,56 @@
# Generated by Django 4.2.12 on 2024-06-30 04:42
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import django.db.models.deletion
import importer.validators
import InvenTree.helpers
from importer.status_codes import DataImportStatusCode
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]
operations = [
migrations.CreateModel(
name='DataImportSession',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('timestamp', models.DateTimeField(auto_now_add=True, verbose_name='Timestamp')),
('data_file', models.FileField(help_text='Data file to import', upload_to='import', validators=[django.core.validators.FileExtensionValidator(allowed_extensions=InvenTree.helpers.GetExportFormats()), importer.validators.validate_data_file], verbose_name='Data File')),
('columns', models.JSONField(blank=True, null=True, verbose_name='Columns')),
('model_type', models.CharField(max_length=100, validators=[importer.validators.validate_importer_model_type])),
('status', models.PositiveIntegerField(choices=DataImportStatusCode.items(), default=DataImportStatusCode.INITIAL.value, help_text='Import status')),
('field_defaults', models.JSONField(blank=True, null=True, validators=[importer.validators.validate_field_defaults], verbose_name='Field Defaults')),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL, verbose_name='User')),
],
),
migrations.CreateModel(
name='DataImportRow',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('row_index', models.PositiveIntegerField(default=0, verbose_name='Row Index')),
('row_data', models.JSONField(blank=True, null=True, verbose_name='Original row data')),
('data', models.JSONField(blank=True, null=True, verbose_name='Data')),
('errors', models.JSONField(blank=True, null=True, verbose_name='Errors')),
('valid', models.BooleanField(default=False, verbose_name='Valid')),
('complete', models.BooleanField(default=False, verbose_name='Complete')),
('session', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='rows', to='importer.dataimportsession', verbose_name='Import Session')),
],
),
migrations.CreateModel(
name='DataImportColumnMap',
fields=[
('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('field', models.CharField(max_length=100, verbose_name='Field')),
('column', models.CharField(blank=True, max_length=100, verbose_name='Column')),
('session', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='column_mappings', to='importer.dataimportsession', verbose_name='Import Session')),
],
),
]

View File

@@ -0,0 +1,267 @@
"""Mixin classes for data import/export functionality."""
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _
import tablib
from rest_framework import fields, serializers
import importer.operations
from InvenTree.helpers import DownloadFile, GetExportFormats, current_date
class DataImportSerializerMixin:
"""Mixin class for adding data import functionality to a DRF serializer."""
import_only_fields = []
import_exclude_fields = []
def get_import_only_fields(self, **kwargs) -> list:
"""Return the list of field names which are only used during data import."""
return self.import_only_fields
def get_import_exclude_fields(self, **kwargs) -> list:
"""Return the list of field names which are excluded during data import."""
return self.import_exclude_fields
def __init__(self, *args, **kwargs):
"""Initialise the DataImportSerializerMixin.
Determine if the serializer is being used for data import,
and if so, adjust the serializer fields accordingly.
"""
importing = kwargs.pop('importing', False)
super().__init__(*args, **kwargs)
if importing:
# Exclude any fields which are not able to be imported
importable_field_names = list(self.get_importable_fields().keys())
field_names = list(self.fields.keys())
for field in field_names:
if field not in importable_field_names:
self.fields.pop(field, None)
# Exclude fields which are excluded for data import
for field in self.get_import_exclude_fields(**kwargs):
self.fields.pop(field, None)
else:
# Exclude fields which are only used for data import
for field in self.get_import_only_fields(**kwargs):
self.fields.pop(field, None)
def get_importable_fields(self) -> dict:
"""Return a dict of fields which can be imported against this serializer instance.
Returns:
dict: A dictionary of field names and field objects
"""
importable_fields = {}
if meta := getattr(self, 'Meta', None):
read_only_fields = getattr(meta, 'read_only_fields', [])
else:
read_only_fields = []
for name, field in self.fields.items():
# Skip read-only fields
if getattr(field, 'read_only', False):
continue
if name in read_only_fields:
continue
# Skip fields which are themselves serializers
if issubclass(field.__class__, serializers.Serializer):
continue
# Skip file fields
if issubclass(field.__class__, fields.FileField):
continue
importable_fields[name] = field
return importable_fields
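The combined effect of the constructor and `get_importable_fields` can be emulated in plain Python, without Django or DRF. `Field` and `FakeSerializer` below are stand-ins for illustration only:

```python
# Plain-Python emulation of DataImportSerializerMixin's import-time pruning.
class Field:
    def __init__(self, read_only=False, is_file=False):
        self.read_only = read_only
        self.is_file = is_file


class FakeSerializer:
    import_exclude_fields = ['image']  # mirrors the mixin's class attribute

    def __init__(self, importing=False):
        self.fields = {
            'pk': Field(read_only=True),   # read-only: never importable
            'name': Field(),
            'image': Field(is_file=True),  # file field: never importable
            'notes': Field(),
        }
        if importing:
            # Keep only writable, non-file fields...
            importable = {
                name for name, f in self.fields.items()
                if not f.read_only and not f.is_file
            }
            for name in list(self.fields):
                if name not in importable:
                    self.fields.pop(name, None)
            # ...then drop explicitly excluded fields as well
            for name in self.import_exclude_fields:
                self.fields.pop(name, None)


remaining = list(FakeSerializer(importing=True).fields)
```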
class DataExportSerializerMixin:
"""Mixin class for adding data export functionality to a DRF serializer."""
export_only_fields = []
export_exclude_fields = []
def get_export_only_fields(self, **kwargs) -> list:
"""Return the list of field names which are only used during data export."""
return self.export_only_fields
def get_export_exclude_fields(self, **kwargs) -> list:
"""Return the list of field names which are excluded during data export."""
return self.export_exclude_fields
def __init__(self, *args, **kwargs):
"""Initialise the DataExportSerializerMixin.
Determine if the serializer is being used for data export,
and if so, adjust the serializer fields accordingly.
"""
exporting = kwargs.pop('exporting', False)
super().__init__(*args, **kwargs)
if exporting:
# Exclude fields which are not required for data export
for field in self.get_export_exclude_fields(**kwargs):
self.fields.pop(field, None)
else:
# Exclude fields which are only used for data export
for field in self.get_export_only_fields(**kwargs):
self.fields.pop(field, None)
def get_exportable_fields(self) -> dict:
"""Return a dict of fields which can be exported against this serializer instance.
Note: Any fields which should be excluded from export have already been removed
Returns:
dict: A dictionary of field names and field objects
"""
fields = {}
if meta := getattr(self, 'Meta', None):
write_only_fields = getattr(meta, 'write_only_fields', [])
else:
write_only_fields = []
for name, field in self.fields.items():
# Skip write-only fields
if getattr(field, 'write_only', False):
continue
if name in write_only_fields:
continue
# Skip fields which are themselves serializers
if issubclass(field.__class__, serializers.Serializer):
continue
fields[name] = field
return fields
def get_exported_filename(self, export_format) -> str:
"""Return the filename for the exported data file.
An implementing class can override this implementation if required.
Arguments:
export_format: The file format to be exported
Returns:
str: The filename for the exported file
"""
model = self.Meta.model
date = current_date().isoformat()
return f'InvenTree_{model.__name__}_{date}.{export_format}'
@classmethod
def arrange_export_headers(cls, headers: list) -> list:
"""Optional method to arrange the export headers."""
return headers
def process_row(self, row):
"""Optional method to process a row before exporting it."""
return row
def export_to_file(self, data, file_format):
"""Export the queryset to a file in the specified format.
Arguments:
queryset: The queryset to export
data: The serialized dataset to export
file_format: The file format to export to
Returns:
File object containing the exported data
"""
# Extract all exportable fields from this serializer
fields = self.get_exportable_fields()
field_names = self.arrange_export_headers(list(fields.keys()))
# Extract human-readable field names
headers = []
for field_name, field in fields.items():
headers.append(importer.operations.get_field_label(field) or field_name)
dataset = tablib.Dataset(headers=headers)
for row in data:
row = self.process_row(row)
dataset.append([row.get(field, None) for field in field_names])
return dataset.export(file_format)
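The core of `export_to_file` — one header row of human-readable labels, one data row per serialized dict, missing keys left blank — can be reproduced with only the standard library. This is an illustrative sketch (the real code uses `tablib` so it can also emit xlsx and other formats):

```python
import csv
import io


def export_rows(field_names, labels, rows, fmt='csv'):
    """Mirror of export_to_file's loop, csv-only, using the stdlib."""
    if fmt != 'csv':
        raise ValueError('this stdlib sketch supports csv only')
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(labels)  # human-readable header row
    for row in rows:
        # Missing keys become None, which csv writes as an empty cell
        writer.writerow([row.get(f, None) for f in field_names])
    return buf.getvalue()


data = export_rows(
    ['name', 'code'],
    ['Name', 'Project Code'],
    [{'name': 'Apollo', 'code': 'PRJ-001'}, {'name': 'Gemini'}],
)
```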
class DataImportExportSerializerMixin(
DataImportSerializerMixin, DataExportSerializerMixin
):
"""Mixin class for adding data import/export functionality to a DRF serializer."""
pass
class DataExportViewMixin:
"""Mixin class for exporting a dataset via the API.
Adding this mixin to an API view allows the user to export the dataset to file in a variety of formats.
We achieve this by overriding the 'get' method, and checking for the presence of the required query parameter.
"""
EXPORT_QUERY_PARAMETER = 'export'
def export_data(self, export_format):
"""Export the data in the specified format.
Use the provided serializer to generate the data, and return it as a file download.
"""
serializer_class = self.get_serializer_class()
if not issubclass(serializer_class, DataExportSerializerMixin):
raise TypeError(
'Serializer class must inherit from DataExportSerializerMixin'
)
queryset = self.filter_queryset(self.get_queryset())
serializer = serializer_class(exporting=True)
serializer.initial_data = queryset
# Export dataset with a second copy of the serializer
# This is because when we pass many=True, the returned class is a ListSerializer
data = serializer_class(queryset, many=True, exporting=True).data
filename = serializer.get_exported_filename(export_format)
datafile = serializer.export_to_file(data, export_format)
return DownloadFile(datafile, filename=filename)
def get(self, request, *args, **kwargs):
"""Override the 'get' method to check for the export query parameter."""
if export_format := request.query_params.get(self.EXPORT_QUERY_PARAMETER, None):
export_format = str(export_format).strip().lower()
if export_format in GetExportFormats():
return self.export_data(export_format)
else:
raise ValidationError({
self.EXPORT_QUERY_PARAMETER: _('Invalid export format')
})
# If the export query parameter is not present, return the default response
return super().get(request, *args, **kwargs)

@@ -0,0 +1,575 @@
"""Model definitions for the 'importer' app."""
import logging
from django.contrib.auth.models import User
from django.core.exceptions import ValidationError as DjangoValidationError
from django.core.validators import FileExtensionValidator
from django.db import models
from django.urls import reverse
from django.utils.translation import gettext_lazy as _
from rest_framework.exceptions import ValidationError as DRFValidationError
import importer.operations
import importer.registry
import importer.tasks
import importer.validators
import InvenTree.helpers
from importer.status_codes import DataImportStatusCode
logger = logging.getLogger('inventree')
class DataImportSession(models.Model):
"""Database model representing a data import session.
An initial file is uploaded, and used to populate the database.
Fields:
timestamp: Timestamp for the import session
data_file: FileField for the data file to import
status: IntegerField for the status of the import session
user: ForeignKey to the User who initiated the import
field_defaults: JSONField for field default values
"""
@staticmethod
def get_api_url():
"""Return the API URL associated with the DataImportSession model."""
return reverse('api-importer-session-list')
def save(self, *args, **kwargs):
"""Save the DataImportSession object."""
initial = self.pk is None
self.clean()
super().save(*args, **kwargs)
if initial:
# New object - run initial setup
self.status = DataImportStatusCode.INITIAL.value
self.extract_columns()
timestamp = models.DateTimeField(auto_now_add=True, verbose_name=_('Timestamp'))
data_file = models.FileField(
upload_to='import',
verbose_name=_('Data File'),
help_text=_('Data file to import'),
validators=[
FileExtensionValidator(
allowed_extensions=InvenTree.helpers.GetExportFormats()
),
importer.validators.validate_data_file,
],
)
columns = models.JSONField(blank=True, null=True, verbose_name=_('Columns'))
model_type = models.CharField(
blank=False,
max_length=100,
validators=[importer.validators.validate_importer_model_type],
)
status = models.PositiveIntegerField(
default=DataImportStatusCode.INITIAL.value,
choices=DataImportStatusCode.items(),
help_text=_('Import status'),
)
user = models.ForeignKey(
User, on_delete=models.SET_NULL, blank=True, null=True, verbose_name=_('User')
)
field_defaults = models.JSONField(
blank=True,
null=True,
verbose_name=_('Field Defaults'),
validators=[importer.validators.validate_field_defaults],
)
@property
def field_mapping(self):
"""Construct a dict of field mappings for this import session.
Returns: A dict of field: column mappings
"""
        mapping = {}
        for column_map in self.column_mappings.all():
            mapping[column_map.field] = column_map.column
        return mapping
@property
def serializer_class(self):
"""Return the serializer class for this importer."""
from importer.registry import supported_models
return supported_models().get(self.model_type, None)
def extract_columns(self):
"""Run initial column extraction and mapping.
This method is called when the import session is first created.
- Extract column names from the data file
- Create a default mapping for each field in the serializer
"""
# Extract list of column names from the file
self.columns = importer.operations.extract_column_names(self.data_file)
serializer_fields = self.available_fields()
# Remove any existing mappings
self.column_mappings.all().delete()
column_mappings = []
matched_columns = set()
# Create a default mapping for each available field in the database
for field, field_def in serializer_fields.items():
# Generate a list of possible column names for this field
field_options = [
field,
field_def.get('label', field),
field_def.get('help_text', field),
]
column_name = ''
for column in self.columns:
# No title provided for the column
if not column:
continue
# Ignore if we have already matched this column to a field
if column in matched_columns:
continue
                # Try direct match
                if column in field_options:
                    column_name = column
                    matched_columns.add(column)
                    break
                # Try lower case match
                if column.lower() in [f.lower() for f in field_options]:
                    column_name = column
                    matched_columns.add(column)
                    break
column_mappings.append(
DataImportColumnMap(session=self, column=column_name, field=field)
)
# Create the column mappings
DataImportColumnMap.objects.bulk_create(column_mappings)
self.status = DataImportStatusCode.MAPPING.value
self.save()
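The matching loop in `extract_columns` can be isolated as a small pure function. This sketch (the field definitions are illustrative) shows the same idea: match each field to the first unclaimed column, exactly first and then case-insensitively, claiming each column at most once:

```python
def match_columns(columns: list, fields: dict) -> dict:
    """Map each field to at most one column name ('' if no match found)."""
    matched = set()
    mapping = {}
    for field, options in fields.items():
        mapping[field] = ''
        for column in columns:
            if not column or column in matched:
                continue  # untitled, or already claimed by another field
            if column in options or column.lower() in [o.lower() for o in options]:
                mapping[field] = column
                matched.add(column)
                break
    return mapping

# Illustrative field options: (field name, label, help text)
fields = {
    'name': ['name', 'Company name', 'Name of the company'],
    'website': ['website', 'Website', 'Company website URL'],
}
print(match_columns(['Company name', 'Website', 'Notes'], fields))
# → {'name': 'Company name', 'website': 'Website'}
```

Claiming columns via the `matched` set prevents two fields from silently mapping to the same file column, which would otherwise have to be caught later by `validate_unique`.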
def accept_mapping(self):
"""Accept current mapping configuration.
- Validate that the current column mapping is correct
- Trigger the data import process
"""
# First, we need to ensure that all the *required* columns have been mapped
required_fields = self.required_fields()
field_defaults = self.field_defaults or {}
missing_fields = []
for field in required_fields.keys():
# A default value exists
if field in field_defaults and field_defaults[field]:
continue
# The field has been mapped to a data column
if mapping := self.column_mappings.filter(field=field).first():
if mapping.column:
continue
missing_fields.append(field)
if len(missing_fields) > 0:
raise DjangoValidationError({
'error': _('Some required fields have not been mapped'),
'fields': missing_fields,
})
# No errors, so trigger the data import process
self.trigger_data_import()
def trigger_data_import(self):
"""Trigger the data import process for this session.
Offloads the task to the background worker process.
"""
from InvenTree.tasks import offload_task
# Mark the import task status as "IMPORTING"
self.status = DataImportStatusCode.IMPORTING.value
self.save()
offload_task(importer.tasks.import_data, self.pk)
def import_data(self):
"""Perform the data import process for this session."""
# Clear any existing data rows
self.rows.all().delete()
df = importer.operations.load_data_file(self.data_file)
if df is None:
# TODO: Log an error message against the import session
logger.error('Failed to load data file')
return
headers = df.headers
imported_rows = []
field_mapping = self.field_mapping
available_fields = self.available_fields()
# Iterate through each "row" in the data file, and create a new DataImportRow object
for idx, row in enumerate(df):
row_data = dict(zip(headers, row))
# Skip completely empty rows
if not any(row_data.values()):
continue
row = importer.models.DataImportRow(
session=self, row_data=row_data, row_index=idx
)
row.extract_data(
field_mapping=field_mapping,
available_fields=available_fields,
commit=False,
)
row.valid = row.validate(commit=False)
imported_rows.append(row)
# Perform database writes as a single operation
importer.models.DataImportRow.objects.bulk_create(imported_rows)
# Mark the import task as "PROCESSING"
self.status = DataImportStatusCode.PROCESSING.value
self.save()
@property
def row_count(self):
"""Return the number of rows in the import session."""
return self.rows.count()
@property
def completed_row_count(self):
"""Return the number of completed rows for this session."""
return self.rows.filter(complete=True).count()
def available_fields(self):
"""Returns information on the available fields.
- This method is designed to be introspected by the frontend, for rendering the various fields.
- We make use of the InvenTree.metadata module to provide extra information about the fields.
Note that we cache these fields, as they are expensive to compute.
"""
if fields := getattr(self, '_available_fields', None):
return fields
from InvenTree.metadata import InvenTreeMetadata
metadata = InvenTreeMetadata()
if serializer_class := self.serializer_class:
serializer = serializer_class(data={}, importing=True)
fields = metadata.get_serializer_info(serializer)
else:
fields = {}
self._available_fields = fields
return fields
def required_fields(self):
"""Returns information on which fields are *required* for import."""
fields = self.available_fields()
required = {}
for field, info in fields.items():
if info.get('required', False):
required[field] = info
return required
class DataImportColumnMap(models.Model):
"""Database model representing a mapping between a file column and serializer field.
- Each row maps a "column" (in the import file) to a "field" (in the serializer)
- Column must exist in the file
- Field must exist in the serializer (and not be read-only)
"""
@staticmethod
def get_api_url():
"""Return the API URL associated with the DataImportColumnMap model."""
return reverse('api-importer-mapping-list')
def save(self, *args, **kwargs):
"""Save the DataImportColumnMap object."""
self.clean()
self.validate_unique()
super().save(*args, **kwargs)
def validate_unique(self, exclude=None):
"""Ensure that the column mapping is unique within the session."""
super().validate_unique(exclude)
columns = self.session.column_mappings.exclude(pk=self.pk)
if (
self.column not in ['', None]
and columns.filter(column=self.column).exists()
):
raise DjangoValidationError({
'column': _('Column is already mapped to a database field')
})
if columns.filter(field=self.field).exists():
raise DjangoValidationError({
'field': _('Field is already mapped to a data column')
})
def clean(self):
"""Validate the column mapping."""
super().clean()
if not self.session:
raise DjangoValidationError({
'session': _('Column mapping must be linked to a valid import session')
})
if self.column and self.column not in self.session.columns:
raise DjangoValidationError({
'column': _('Column does not exist in the data file')
})
field_def = self.field_definition
if not field_def:
raise DjangoValidationError({
'field': _('Field does not exist in the target model')
})
if field_def.get('read_only', False):
raise DjangoValidationError({'field': _('Selected field is read-only')})
session = models.ForeignKey(
DataImportSession,
on_delete=models.CASCADE,
verbose_name=_('Import Session'),
related_name='column_mappings',
)
field = models.CharField(max_length=100, verbose_name=_('Field'))
column = models.CharField(blank=True, max_length=100, verbose_name=_('Column'))
@property
def available_fields(self):
"""Return a list of available fields for this import session.
These fields get cached, as they are expensive to compute.
"""
if fields := getattr(self, '_available_fields', None):
return fields
self._available_fields = self.session.available_fields()
return self._available_fields
@property
def field_definition(self):
"""Return the field definition associated with this column mapping."""
fields = self.available_fields
return fields.get(self.field, None)
@property
def label(self):
"""Extract the 'label' associated with the mapped field."""
if field_def := self.field_definition:
return field_def.get('label', None)
@property
def description(self):
"""Extract the 'description' associated with the mapped field."""
description = None
if field_def := self.field_definition:
description = field_def.get('help_text', None)
if not description:
description = self.label
return description
class DataImportRow(models.Model):
"""Database model representing a single row in a data import session.
Each row corresponds to a single row in the import file, and is used to populate the database.
Fields:
session: ForeignKey to the parent DataImportSession object
data: JSONField for the data in this row
status: IntegerField for the status of the row import
"""
@staticmethod
def get_api_url():
"""Return the API URL associated with the DataImportRow model."""
return reverse('api-importer-row-list')
def save(self, *args, **kwargs):
"""Save the DataImportRow object."""
self.valid = self.validate()
super().save(*args, **kwargs)
session = models.ForeignKey(
DataImportSession,
on_delete=models.CASCADE,
verbose_name=_('Import Session'),
related_name='rows',
)
row_index = models.PositiveIntegerField(default=0, verbose_name=_('Row Index'))
row_data = models.JSONField(
blank=True, null=True, verbose_name=_('Original row data')
)
data = models.JSONField(blank=True, null=True, verbose_name=_('Data'))
errors = models.JSONField(blank=True, null=True, verbose_name=_('Errors'))
valid = models.BooleanField(default=False, verbose_name=_('Valid'))
complete = models.BooleanField(default=False, verbose_name=_('Complete'))
def extract_data(
self, available_fields: dict = None, field_mapping: dict = None, commit=True
):
"""Extract row data from the provided data dictionary."""
if not field_mapping:
field_mapping = self.session.field_mapping
if not available_fields:
available_fields = self.session.available_fields()
default_values = self.session.field_defaults or {}
data = {}
# We have mapped column (file) to field (serializer) already
for field, col in field_mapping.items():
# If this field is *not* mapped to any column, skip
if not col:
continue
# Extract field type
field_def = available_fields.get(field, {})
field_type = field_def.get('type', None)
value = self.row_data.get(col, None)
if field_type == 'boolean':
value = InvenTree.helpers.str2bool(value)
elif field_type == 'date':
value = value or None
# Use the default value, if provided
if value in [None, ''] and field in default_values:
value = default_values[field]
data[field] = value
self.data = data
if commit:
self.save()
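`extract_data` applies simple per-type coercion plus session defaults. A standalone sketch of that value pipeline (`str2bool` here is a simplified stand-in for `InvenTree.helpers.str2bool`, and the type names mirror the serializer metadata):

```python
def str2bool(value) -> bool:
    # Simplified stand-in for InvenTree.helpers.str2bool
    return str(value).strip().lower() in ['1', 'true', 'yes', 'y', 'on']

def extract_row(row_data: dict, field_mapping: dict, field_types: dict, defaults: dict) -> dict:
    """Map raw column values onto serializer fields, coercing types and applying defaults."""
    data = {}
    for field, col in field_mapping.items():
        if not col:
            continue  # field not mapped to any column
        value = row_data.get(col)
        if field_types.get(field) == 'boolean':
            value = str2bool(value)
        elif field_types.get(field) == 'date':
            value = value or None  # empty string becomes None
        # Fall back to the session-level default, if one is provided
        if value in [None, ''] and field in defaults:
            value = defaults[field]
        data[field] = value
    return data

row = extract_row(
    {'Name': 'ACME', 'Active': 'True', 'Currency': ''},
    {'name': 'Name', 'active': 'Active', 'currency': 'Currency'},
    {'active': 'boolean'},
    {'currency': 'USD'},
)
print(row)  # → {'name': 'ACME', 'active': True, 'currency': 'USD'}
```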
def serializer_data(self):
"""Construct data object to be sent to the serializer.
- If available, we use the "default" values provided by the import session
- If available, we use the "override" values provided by the import session
"""
        # Copy the defaults, so we do not mutate the session's field_defaults in place
        data = dict(self.session.field_defaults or {})
if self.data:
data.update(self.data)
return data
def construct_serializer(self):
"""Construct a serializer object for this row."""
if serializer_class := self.session.serializer_class:
return serializer_class(data=self.serializer_data())
def validate(self, commit=False) -> bool:
"""Validate the data in this row against the linked serializer.
Arguments:
commit: If True, the data is saved to the database (if validation passes)
Returns:
True if the data is valid, False otherwise
Raises:
ValidationError: If the linked serializer is not valid
"""
if self.complete:
# Row has already been completed
return True
serializer = self.construct_serializer()
if not serializer:
self.errors = {
'non_field_errors': 'No serializer class linked to this import session'
}
return False
result = False
try:
result = serializer.is_valid(raise_exception=True)
        except (DjangoValidationError, DRFValidationError) as e:
            # DRF errors expose 'detail'; Django errors expose 'message_dict'
            self.errors = getattr(e, 'detail', None) or getattr(
                e, 'message_dict', {'non_field_errors': str(e)}
            )
if result:
self.errors = None
if commit:
try:
serializer.save()
self.complete = True
self.save()
except Exception as e:
self.errors = {'non_field_errors': str(e)}
result = False
return result

@@ -0,0 +1,122 @@
"""Data import operational functions."""
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _
import tablib
import InvenTree.helpers
def load_data_file(data_file, file_format=None):
"""Load data file into a tablib dataset.
Arguments:
        data_file: Django file object containing the data to import (must already be opened)
file_format: Format specifier for the data file
"""
# Introspect the file format based on the provided file
if not file_format:
file_format = data_file.name.split('.')[-1]
if file_format and file_format.startswith('.'):
file_format = file_format[1:]
file_format = file_format.strip().lower()
if file_format not in InvenTree.helpers.GetExportFormats():
raise ValidationError(_('Unsupported data file format'))
file_object = data_file.file
if hasattr(file_object, 'open'):
file_object.open('r')
file_object.seek(0)
try:
data = file_object.read()
except (IOError, FileNotFoundError):
raise ValidationError(_('Failed to open data file'))
# Excel formats expect binary data
if file_format not in ['xls', 'xlsx']:
data = data.decode()
try:
data = tablib.Dataset().load(data, headers=True, format=file_format)
except tablib.core.UnsupportedFormat:
raise ValidationError(_('Unsupported data file format'))
except tablib.core.InvalidDimensions:
raise ValidationError(_('Invalid data file dimensions'))
return data
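`load_data_file` delegates the actual parsing to tablib. The same headers-plus-rows shape can be sketched with only the standard library (a CSV-only stand-in for the tablib-backed loader):

```python
import csv
import io

def load_csv(text: str):
    """Parse CSV text into (headers, rows) - a stdlib stand-in for tablib.Dataset."""
    reader = csv.reader(io.StringIO(text))
    rows = list(reader)
    if not rows:
        raise ValueError('Data file contains no headers')
    # First row is treated as the header row, mirroring headers=True in tablib
    return rows[0], rows[1:]

headers, rows = load_csv('ID,Company name\n1,DigiKey\n2,Mouser\n')
print(headers, rows)
```

tablib is used in the real implementation because it handles multiple formats (csv, tsv, xls, xlsx) behind one interface; the binary-vs-text branch above exists because the Excel formats must not be decoded before loading.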
def extract_column_names(data_file) -> list:
"""Extract column names from a data file.
Uses the tablib library to extract column names from a data file.
Args:
data_file: File object containing data to import
Returns:
List of column names extracted from the file
Raises:
ValidationError: If the data file is not in a valid format
"""
data = load_data_file(data_file)
headers = []
for idx, header in enumerate(data.headers):
if header:
headers.append(header)
else:
# If the header is empty, generate a default header
headers.append(f'Column {idx + 1}')
return headers
def extract_rows(data_file) -> list:
"""Extract rows from the data file.
Each returned row is a dictionary of column_name: value pairs.
"""
data = load_data_file(data_file)
headers = data.headers
rows = []
for row in data:
rows.append(dict(zip(headers, row)))
return rows
def get_field_label(field) -> str:
"""Return the label for a field in a serializer class.
Check for labels in the following order of descending priority:
- The serializer class has a 'label' specified for the field
- The underlying model has a 'verbose_name' specified
- The field name is used as the label
Arguments:
field: Field instance from a serializer class
Returns:
str: Field label
"""
if field:
if label := getattr(field, 'label', None):
return label
# TODO: Check if the field is a model field
return None

@@ -0,0 +1,72 @@
"""Registry for supported serializers for data import operations."""
import logging
from rest_framework.serializers import Serializer
from importer.mixins import DataImportSerializerMixin
logger = logging.getLogger('inventree')
class DataImportSerializerRegister:
"""Registry for supported serializers for data import operations.
To add a new serializer to the registry, add the @register_importer decorator to the serializer class.
"""
supported_serializers: list[Serializer] = []
def register(self, serializer) -> None:
"""Register a new serializer with the importer registry."""
        if not issubclass(serializer, DataImportSerializerMixin):
            logger.debug('Invalid serializer class: %s', serializer)
            return
        if not issubclass(serializer, Serializer):
            logger.debug('Invalid serializer class: %s', serializer)
            return
logger.debug('Registering serializer class for import: %s', type(serializer))
if serializer not in self.supported_serializers:
self.supported_serializers.append(serializer)
_serializer_registry = DataImportSerializerRegister()
def get_supported_serializers():
"""Return a list of supported serializers which can be used for importing data."""
return _serializer_registry.supported_serializers
def supported_models():
"""Return a map of supported models to their respective serializers."""
data = {}
for serializer in get_supported_serializers():
model = serializer.Meta.model
data[model.__name__.lower()] = serializer
return data
def supported_model_options():
"""Return a list of supported model options for importing data."""
options = []
for model_name, serializer in supported_models().items():
options.append((model_name, serializer.Meta.model._meta.verbose_name))
return options
def register_importer():
"""Decorator function to register a serializer with the importer registry."""
def _decorator(cls):
_serializer_registry.register(cls)
return cls
return _decorator

@@ -0,0 +1,170 @@
"""API serializers for the importer app."""
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers
import importer.models
import importer.registry
from InvenTree.serializers import (
InvenTreeAttachmentSerializerField,
InvenTreeModelSerializer,
UserSerializer,
)
class DataImportColumnMapSerializer(InvenTreeModelSerializer):
"""Serializer for the DataImportColumnMap model."""
class Meta:
"""Meta class options for the serializer."""
model = importer.models.DataImportColumnMap
fields = ['pk', 'session', 'column', 'field', 'label', 'description']
read_only_fields = ['field', 'session']
label = serializers.CharField(read_only=True)
description = serializers.CharField(read_only=True)
class DataImportSessionSerializer(InvenTreeModelSerializer):
"""Serializer for the DataImportSession model."""
class Meta:
"""Meta class options for the serializer."""
model = importer.models.DataImportSession
fields = [
'pk',
'timestamp',
'data_file',
'model_type',
'available_fields',
'status',
'user',
'user_detail',
'columns',
'column_mappings',
'field_defaults',
'row_count',
'completed_row_count',
]
read_only_fields = ['pk', 'user', 'status', 'columns']
def __init__(self, *args, **kwargs):
"""Override the constructor for the DataImportSession serializer."""
super().__init__(*args, **kwargs)
self.fields['model_type'].choices = importer.registry.supported_model_options()
data_file = InvenTreeAttachmentSerializerField()
model_type = serializers.ChoiceField(
required=True,
allow_blank=False,
choices=importer.registry.supported_model_options(),
)
available_fields = serializers.JSONField(read_only=True)
row_count = serializers.IntegerField(read_only=True)
completed_row_count = serializers.IntegerField(read_only=True)
column_mappings = DataImportColumnMapSerializer(many=True, read_only=True)
user_detail = UserSerializer(source='user', read_only=True, many=False)
def create(self, validated_data):
"""Override create method for this serializer.
Attach user information based on provided session data.
"""
session = super().create(validated_data)
request = self.context.get('request', None)
if request:
session.user = request.user
session.save()
return session
class DataImportRowSerializer(InvenTreeModelSerializer):
"""Serializer for the DataImportRow model."""
class Meta:
"""Meta class options for the serializer."""
model = importer.models.DataImportRow
fields = [
'pk',
'session',
'row_index',
'row_data',
'data',
'errors',
'valid',
'complete',
]
read_only_fields = [
'pk',
'session',
'row_index',
'row_data',
'errors',
'valid',
'complete',
]
class DataImportAcceptRowSerializer(serializers.Serializer):
"""Serializer for accepting rows of data."""
class Meta:
"""Serializer meta options."""
fields = ['rows']
rows = serializers.PrimaryKeyRelatedField(
queryset=importer.models.DataImportRow.objects.all(),
many=True,
required=True,
label=_('Rows'),
help_text=_('List of row IDs to accept'),
)
def validate_rows(self, rows):
"""Ensure that the provided rows are valid.
- Row must point to the same import session
- Row must contain valid data
- Row must not have already been completed
"""
session = self.context.get('session', None)
if not rows or len(rows) == 0:
raise ValidationError(_('No rows provided'))
for row in rows:
if row.session != session:
raise ValidationError(_('Row does not belong to this session'))
if not row.valid:
raise ValidationError(_('Row contains invalid data'))
if row.complete:
raise ValidationError(_('Row has already been completed'))
return rows
def save(self):
"""Complete the provided rows."""
rows = self.validated_data['rows']
for row in rows:
row.validate(commit=True)
return rows

@@ -0,0 +1,19 @@
"""Status codes for common model types."""
from django.utils.translation import gettext_lazy as _
from generic.states import StatusCode
class DataImportStatusCode(StatusCode):
"""Defines a set of status codes for a DataImportSession."""
INITIAL = 0, _('Initializing'), 'secondary' # Import session has been created
MAPPING = 10, _('Mapping Columns'), 'primary' # Import fields are being mapped
IMPORTING = 20, _('Importing Data'), 'primary' # Data is being imported
PROCESSING = (
30,
_('Processing Data'),
'primary',
) # Data is being processed by the user
COMPLETE = 40, _('Complete'), 'success' # Import has been completed
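Each status is a `(value, label, color)` triple consumed by the `generic.states.StatusCode` machinery. A plain-enum sketch of the same shape (without the StatusCode metaclass conveniences):

```python
from enum import Enum

class ImportStatus(Enum):
    """Sketch of (value, label, color) status triples, mirroring generic.states.StatusCode."""

    INITIAL = (0, 'Initializing', 'secondary')
    MAPPING = (10, 'Mapping Columns', 'primary')
    IMPORTING = (20, 'Importing Data', 'primary')
    PROCESSING = (30, 'Processing Data', 'primary')
    COMPLETE = (40, 'Complete', 'success')

    @property
    def value_code(self) -> int:
        return self.value[0]

    @property
    def label(self) -> str:
        return self.value[1]

print(ImportStatus.MAPPING.value_code, ImportStatus.MAPPING.label)
# → 10 Mapping Columns
```

The gaps of 10 between values leave room to insert intermediate states later without renumbering stored sessions.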

@@ -0,0 +1,53 @@
"""Task definitions for the 'importer' app."""
import logging
from datetime import timedelta
import InvenTree.helpers
import InvenTree.tasks
logger = logging.getLogger('inventree')
def import_data(session_id: int):
"""Load data from the provided file.
Attempt to load data from the provided file, and potentially handle any errors.
"""
import importer.models
import importer.operations
import importer.status_codes
try:
session = importer.models.DataImportSession.objects.get(pk=session_id)
logger.info("Loading data from session ID '%s'", session_id)
session.import_data()
except (ValueError, importer.models.DataImportSession.DoesNotExist):
logger.error("Data import session with ID '%s' does not exist", session_id)
return
@InvenTree.tasks.scheduled_task(InvenTree.tasks.ScheduledTask.DAILY)
def cleanup_import_sessions():
"""Periodically remove old import sessions.
Every 5 days, remove any importer sessions that are more than 5 days old
"""
CLEANUP_DAYS = 5
import importer.models
if not InvenTree.tasks.check_daily_holdoff('cleanup_import_sessions', CLEANUP_DAYS):
return
logger.info('Cleaning old data import sessions')
before = InvenTree.helpers.current_date() - timedelta(days=CLEANUP_DAYS)
sessions = importer.models.DataImportSession.objects.filter(timestamp__lte=before)
if sessions.count() > 0:
logger.info('Deleting %s old data import sessions', sessions.count())
sessions.delete()
InvenTree.tasks.record_task_success('cleanup_import_sessions')
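The cleanup task deletes sessions whose timestamp is at or before a cutoff date. The cutoff arithmetic in isolation:

```python
from datetime import date, timedelta

CLEANUP_DAYS = 5

def cleanup_cutoff(today: date) -> date:
    """Sessions with timestamp <= this date are eligible for deletion."""
    return today - timedelta(days=CLEANUP_DAYS)

print(cleanup_cutoff(date(2024, 5, 20)))  # → 2024-05-15
```

Combined with `check_daily_holdoff`, this means the query runs at most once per `CLEANUP_DAYS` window rather than on every daily tick.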

@@ -0,0 +1,13 @@
ID,Company name,Company description,Website,Phone number,Address,Email,Currency,Contact,Link,Image,Active,Is customer,Is manufacturer,Is supplier,Notes,Parts supplied,Parts manufactured,Address count
3,Arrow,Arrow Electronics,https://www.arrow.com/,,"70680 Shannon Rapid Apt. 570, 96124, Jenniferport, Arkansas, Holy See (Vatican City State)",,AUD,,,/media/company_images/company_3_img.jpg,True,False,False,True,,60,0,2
1,DigiKey,DigiKey Electronics,https://www.digikey.com/,,"04964 Cox View Suite 815, 94832, Wesleyport, Delaware, Bolivia",,USD,,,/media/company_images/company_1_img.jpg,True,False,False,True,,200,0,2
41,Future,Electronic components distributor,https://www.futureelectronics.com/,,"Wogan Terrace 79, 20157, Teasdale, Lebanon",,USD,,,/media/company_images/company_41_img.png,True,False,False,True,,60,0,4
39,LCSC,Electronic components distributor,https://lcsc.com/,,"77673 Bishop Turnpike, 74969, North Cheryl, Hawaii, Portugal",,USD,,,/media/company_images/company_39_img.webp,True,False,False,True,,60,0,2
38,McMaster-Carr,Supplier of mechanical components,https://www.mcmaster.com/,,"Schroeders Avenue 56, 8014, Sylvanite, Cayman Islands",,USD,,,/media/company_images/company_38_img.png,True,False,False,True,,240,0,1
2,Mouser,Mouser Electronics,https://mouser.com/,,"Ashford Street 71, 24165, Leland, Jamaica",,AUD,,,/media/company_images/company_2_img.jpg,True,False,False,True,,61,0,2
40,Newark,Online distributor of electronic components,https://www.newark.com/,,"Dekoven Court 3, 18301, Emison, Tuvalu",,USD,,,/media/company_images/company_40_img.png,True,False,False,True,,60,0,1
36,Paint by Numbers,Supplier of high quality paint,,,"Orient Avenue 59, 18609, Corinne, Alabama, France, Metropolitan",,EUR,Pippy Painter,,/media/company_images/company_36_img.jpg,True,False,False,True,,15,0,1
43,PCBWOY,PCB fabricator / supplier,,,"McKibben Street 77, 12370, Russellville, Benin",,USD,,,/media/company_images/company_43_img.png,True,False,False,True,,1,0,2
29,Texas Instruments,,https://www.ti.com/,,"264 David Villages, 97718, Lake Michael, New Mexico, Kenya",,USD,,,/media/company_images/company_29_img.jpg,True,False,True,True,,0,1,2
44,Wire-E-Coyote,American wire supplier,,,"Fountain Avenue 74, 12115, Gulf, Seychelles",,USD,,,,True,False,False,True,,5,0,3
42,Wirey,Supplier of wire,,,"Preston Court 80, 4462, Manila, Russian Federation",,USD,,,/media/company_images/company_42_img.jpg,True,False,False,True,,11,0,2

@@ -0,0 +1,64 @@
"""Unit tests for the 'importer' app."""
import os
from django.core.files.base import ContentFile
from importer.models import DataImportSession
from InvenTree.unit_test import InvenTreeTestCase
class ImporterTest(InvenTreeTestCase):
"""Basic tests for file imports."""
def test_import_session(self):
"""Test creation of a data import session."""
from company.models import Company
n = Company.objects.count()
fn = os.path.join(os.path.dirname(__file__), 'test_data', 'companies.csv')
with open(fn, 'r') as input_file:
data = input_file.read()
session = DataImportSession.objects.create(
data_file=ContentFile(data, 'companies.csv'), model_type='company'
)
session.extract_columns()
self.assertEqual(session.column_mappings.count(), 14)
# Check some of the field mappings
for field, col in [
('website', 'Website'),
('is_customer', 'Is customer'),
('phone', 'Phone number'),
('description', 'Company description'),
('active', 'Active'),
]:
self.assertTrue(
session.column_mappings.filter(field=field, column=col).exists()
)
# Run the data import
session.import_data()
self.assertEqual(session.rows.count(), 12)
# Check that some data has been imported
for row in session.rows.all():
self.assertIsNotNone(row.data.get('name', None))
self.assertTrue(row.valid)
row.validate(commit=True)
self.assertTrue(row.complete)
self.assertEqual(session.completed_row_count, 12)
# Check that the new companies have been created
self.assertEqual(n + 12, Company.objects.count())
def test_field_defaults(self):
"""Test default field values."""
...

@@ -0,0 +1,49 @@
"""Custom validation routines for the 'importer' app."""
import os
from django.core.exceptions import ValidationError
from django.utils.translation import gettext_lazy as _
# Define maximum limits for imported file data
IMPORTER_MAX_FILE_SIZE = 32 * 1024 * 1024  # 32 MB
IMPORTER_MAX_ROWS = 5000
IMPORTER_MAX_COLS = 1000
def validate_data_file(data_file):
"""Validate the provided data file."""
import importer.operations
filesize = data_file.size
if filesize > IMPORTER_MAX_FILE_SIZE:
raise ValidationError(_('Data file exceeds maximum size limit'))
dataset = importer.operations.load_data_file(data_file)
if not dataset.headers or len(dataset.headers) == 0:
raise ValidationError(_('Data file contains no headers'))
if len(dataset.headers) > IMPORTER_MAX_COLS:
raise ValidationError(_('Data file contains too many columns'))
if len(dataset) > IMPORTER_MAX_ROWS:
raise ValidationError(_('Data file contains too many rows'))
def validate_importer_model_type(value):
"""Validate that the given model type is supported for importing."""
from importer.registry import supported_models
if value not in supported_models().keys():
raise ValidationError(f"Unsupported model type '{value}'")
def validate_field_defaults(value):
"""Validate that the provided value is a valid dict."""
if value is None:
return
if type(value) is not dict:
raise ValidationError(_('Value must be a valid dictionary object'))
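The limit checks above can be sketched without Django. The real validator loads the file through tablib (via `importer.operations.load_data_file`); the `check_limits` helper below is a hypothetical stand-in that operates on raw CSV text with only the standard library:

```python
# Standard-library sketch of the validator's limit checks. check_limits is a
# hypothetical stand-in for the tablib-based loading the real code performs.
import csv
import io

IMPORTER_MAX_ROWS = 5000
IMPORTER_MAX_COLS = 1000


def check_limits(raw_text):
    """Return (headers, rows), or raise ValueError if a limit is exceeded."""
    table = list(csv.reader(io.StringIO(raw_text)))

    if not table or not table[0]:
        raise ValueError('Data file contains no headers')

    headers, rows = table[0], table[1:]

    if len(headers) > IMPORTER_MAX_COLS:
        raise ValueError('Data file contains too many columns')

    if len(rows) > IMPORTER_MAX_ROWS:
        raise ValueError('Data file contains too many rows')

    return headers, rows


headers, rows = check_limits('name,website\nACME,https://acme.test\n')
```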

View File

@ -21,21 +21,13 @@ import common.models
import common.settings
import company.models
from generic.states.api import StatusView
from InvenTree.api import APIDownloadMixin, ListCreateDestroyAPIView, MetadataView
from importer.mixins import DataExportViewMixin
from InvenTree.api import ListCreateDestroyAPIView, MetadataView
from InvenTree.filters import SEARCH_ORDER_FILTER, SEARCH_ORDER_FILTER_ALIAS
from InvenTree.helpers import DownloadFile, str2bool
from InvenTree.helpers import str2bool
from InvenTree.helpers_model import construct_absolute_url, get_base_url
from InvenTree.mixins import CreateAPI, ListAPI, ListCreateAPI, RetrieveUpdateDestroyAPI
from order import models, serializers
from order.admin import (
PurchaseOrderExtraLineResource,
PurchaseOrderLineItemResource,
PurchaseOrderResource,
ReturnOrderResource,
SalesOrderExtraLineResource,
SalesOrderLineItemResource,
SalesOrderResource,
)
from order.status_codes import (
PurchaseOrderStatus,
PurchaseOrderStatusGroups,
@ -48,7 +40,7 @@ from part.models import Part
from users.models import Owner
class GeneralExtraLineList(APIDownloadMixin):
class GeneralExtraLineList(DataExportViewMixin):
"""General template for ExtraLine API classes."""
def get_serializer(self, *args, **kwargs):
@ -211,7 +203,7 @@ class PurchaseOrderMixin:
return queryset
class PurchaseOrderList(PurchaseOrderMixin, APIDownloadMixin, ListCreateAPI):
class PurchaseOrderList(PurchaseOrderMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of PurchaseOrder objects.
- GET: Return list of PurchaseOrder objects (with filters)
@ -268,16 +260,6 @@ class PurchaseOrderList(PurchaseOrderMixin, APIDownloadMixin, ListCreateAPI):
serializer.data, status=status.HTTP_201_CREATED, headers=headers
)
def download_queryset(self, queryset, export_format):
"""Download the filtered queryset as a file."""
dataset = PurchaseOrderResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_PurchaseOrders.{export_format}'
return DownloadFile(filedata, filename)
def filter_queryset(self, queryset):
"""Custom queryset filtering."""
# Perform basic filtering
@ -529,7 +511,7 @@ class PurchaseOrderLineItemMixin:
class PurchaseOrderLineItemList(
PurchaseOrderLineItemMixin, APIDownloadMixin, ListCreateDestroyAPIView
PurchaseOrderLineItemMixin, DataExportViewMixin, ListCreateDestroyAPIView
):
"""API endpoint for accessing a list of PurchaseOrderLineItem objects.
@ -577,16 +559,6 @@ class PurchaseOrderLineItemList(
serializer.data, status=status.HTTP_201_CREATED, headers=headers
)
def download_queryset(self, queryset, export_format):
"""Download the requested queryset as a file."""
dataset = PurchaseOrderLineItemResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_PurchaseOrderItems.{export_format}'
return DownloadFile(filedata, filename)
filter_backends = SEARCH_ORDER_FILTER_ALIAS
ordering_field_aliases = {
@ -632,14 +604,6 @@ class PurchaseOrderExtraLineList(GeneralExtraLineList, ListCreateAPI):
queryset = models.PurchaseOrderExtraLine.objects.all()
serializer_class = serializers.PurchaseOrderExtraLineSerializer
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file."""
dataset = PurchaseOrderExtraLineResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_ExtraPurchaseOrderLines.{export_format}'
return DownloadFile(filedata, filename)
class PurchaseOrderExtraLineDetail(RetrieveUpdateDestroyAPI):
"""API endpoint for detail view of a PurchaseOrderExtraLine object."""
@ -689,7 +653,7 @@ class SalesOrderMixin:
return queryset
class SalesOrderList(SalesOrderMixin, APIDownloadMixin, ListCreateAPI):
class SalesOrderList(SalesOrderMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of SalesOrder objects.
- GET: Return list of SalesOrder objects (with filters)
@ -712,16 +676,6 @@ class SalesOrderList(SalesOrderMixin, APIDownloadMixin, ListCreateAPI):
serializer.data, status=status.HTTP_201_CREATED, headers=headers
)
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file."""
dataset = SalesOrderResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_SalesOrders.{export_format}'
return DownloadFile(filedata, filename)
def filter_queryset(self, queryset):
"""Perform custom filtering operations on the SalesOrder queryset."""
queryset = super().filter_queryset(queryset)
@ -871,20 +825,13 @@ class SalesOrderLineItemMixin:
return queryset
class SalesOrderLineItemList(SalesOrderLineItemMixin, APIDownloadMixin, ListCreateAPI):
class SalesOrderLineItemList(
SalesOrderLineItemMixin, DataExportViewMixin, ListCreateAPI
):
"""API endpoint for accessing a list of SalesOrderLineItem objects."""
filterset_class = SalesOrderLineItemFilter
def download_queryset(self, queryset, export_format):
"""Download the requested queryset as a file."""
dataset = SalesOrderLineItemResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_SalesOrderItems.{export_format}'
return DownloadFile(filedata, filename)
filter_backends = SEARCH_ORDER_FILTER_ALIAS
ordering_fields = [
@ -919,14 +866,6 @@ class SalesOrderExtraLineList(GeneralExtraLineList, ListCreateAPI):
queryset = models.SalesOrderExtraLine.objects.all()
serializer_class = serializers.SalesOrderExtraLineSerializer
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file."""
dataset = SalesOrderExtraLineResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_ExtraSalesOrderLines.{export_format}'
return DownloadFile(filedata, filename)
class SalesOrderExtraLineDetail(RetrieveUpdateDestroyAPI):
"""API endpoint for detail view of a SalesOrderExtraLine object."""
@ -1175,7 +1114,7 @@ class ReturnOrderMixin:
return queryset
class ReturnOrderList(ReturnOrderMixin, APIDownloadMixin, ListCreateAPI):
class ReturnOrderList(ReturnOrderMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of ReturnOrder objects."""
filterset_class = ReturnOrderFilter
@ -1194,14 +1133,6 @@ class ReturnOrderList(ReturnOrderMixin, APIDownloadMixin, ListCreateAPI):
serializer.data, status=status.HTTP_201_CREATED, headers=headers
)
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file."""
dataset = ReturnOrderResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_ReturnOrders.{export_format}'
return DownloadFile(filedata, filename)
filter_backends = SEARCH_ORDER_FILTER_ALIAS
ordering_field_aliases = {
@ -1336,18 +1267,12 @@ class ReturnOrderLineItemMixin:
class ReturnOrderLineItemList(
ReturnOrderLineItemMixin, APIDownloadMixin, ListCreateAPI
ReturnOrderLineItemMixin, DataExportViewMixin, ListCreateAPI
):
"""API endpoint for accessing a list of ReturnOrderLineItemList objects."""
filterset_class = ReturnOrderLineItemFilter
def download_queryset(self, queryset, export_format):
"""Download the requested queryset as a file."""
raise NotImplementedError(
'download_queryset not yet implemented for this endpoint'
)
filter_backends = SEARCH_ORDER_FILTER
ordering_fields = ['reference', 'target_date', 'received_date']
@ -1372,10 +1297,6 @@ class ReturnOrderExtraLineList(GeneralExtraLineList, ListCreateAPI):
queryset = models.ReturnOrderExtraLine.objects.all()
serializer_class = serializers.ReturnOrderExtraLineSerializer
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file."""
raise NotImplementedError('download_queryset not yet implemented')
class ReturnOrderExtraLineDetail(RetrieveUpdateDestroyAPI):
"""API endpoint for detail view of a ReturnOrderExtraLine object."""

View File

@ -44,6 +44,7 @@ class Migration(migrations.Migration):
],
options={
'abstract': False,
'verbose_name': 'Purchase Order Line Item',
},
),
]

View File

@ -59,6 +59,7 @@ class Migration(migrations.Migration):
],
options={
'abstract': False,
'verbose_name': 'Sales Order Line Item',
},
),
migrations.CreateModel(

View File

@ -22,5 +22,8 @@ class Migration(migrations.Migration):
('item', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, related_name='sales_order_allocation', to='stock.StockItem')),
('line', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='allocations', to='order.SalesOrderLineItem')),
],
options={
'verbose_name': 'Sales Order Allocation',
},
),
]

View File

@ -23,5 +23,8 @@ class Migration(migrations.Migration):
('checked_by', models.ForeignKey(blank=True, help_text='User who checked this shipment', null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL, verbose_name='Checked By')),
('order', models.ForeignKey(help_text='Sales Order', on_delete=django.db.models.deletion.CASCADE, related_name='shipments', to='order.salesorder', verbose_name='Order')),
],
options={
'verbose_name': 'Sales Order Shipment',
},
),
]

View File

@ -86,6 +86,7 @@ class Migration(migrations.Migration):
],
options={
'abstract': False,
'verbose_name': 'Sales Order Extra Line',
},
),
migrations.CreateModel(
@ -103,6 +104,7 @@ class Migration(migrations.Migration):
],
options={
'abstract': False,
'verbose_name': 'Purchase Order Extra Line',
},
),
migrations.RunPython(convert_line_items, reverse_code=nunconvert_line_items),

View File

@ -30,6 +30,7 @@ class Migration(migrations.Migration):
],
options={
'abstract': False,
'verbose_name': 'Return Order Extra Line',
},
),
]

View File

@ -44,6 +44,7 @@ class Migration(migrations.Migration):
],
options={
'unique_together': {('order', 'item')},
'verbose_name': 'Return Order Line Item',
},
),
]

View File

@ -400,7 +400,7 @@ class PurchaseOrder(TotalPriceMixin, Order):
return PurchaseOrderStatusGroups
@classmethod
def api_defaults(cls, request):
def api_defaults(cls, request=None):
"""Return default values for this model when issuing an API OPTIONS request."""
defaults = {
'reference': order.validators.generate_next_purchase_order_reference()
@ -865,7 +865,7 @@ class SalesOrder(TotalPriceMixin, Order):
return SalesOrderStatusGroups
@classmethod
def api_defaults(cls, request):
def api_defaults(cls, request=None):
"""Return default values for this model when issuing an API OPTIONS request."""
defaults = {'reference': order.validators.generate_next_sales_order_reference()}
@ -1355,6 +1355,11 @@ class PurchaseOrderLineItem(OrderLineItem):
order: Reference to a PurchaseOrder object
"""
class Meta:
"""Model meta options."""
verbose_name = _('Purchase Order Line Item')
# Filter for determining if a particular PurchaseOrderLineItem is overdue
OVERDUE_FILTER = (
Q(received__lt=F('quantity'))
@ -1492,6 +1497,11 @@ class PurchaseOrderExtraLine(OrderExtraLine):
price: The unit price for this OrderLine
"""
class Meta:
"""Model meta options."""
verbose_name = _('Purchase Order Extra Line')
@staticmethod
def get_api_url():
"""Return the API URL associated with the PurchaseOrderExtraLine model."""
@ -1516,6 +1526,11 @@ class SalesOrderLineItem(OrderLineItem):
shipped: The number of items which have actually shipped against this line item
"""
class Meta:
"""Model meta options."""
verbose_name = _('Sales Order Line Item')
# Filter for determining if a particular SalesOrderLineItem is overdue
OVERDUE_FILTER = (
Q(shipped__lt=F('quantity'))
@ -1649,6 +1664,7 @@ class SalesOrderShipment(
# Shipment reference must be unique for a given sales order
unique_together = ['order', 'reference']
verbose_name = _('Sales Order Shipment')
@staticmethod
def get_api_url():
@ -1806,6 +1822,11 @@ class SalesOrderExtraLine(OrderExtraLine):
price: The unit price for this OrderLine
"""
class Meta:
"""Model meta options."""
verbose_name = _('Sales Order Extra Line')
@staticmethod
def get_api_url():
"""Return the API URL associated with the SalesOrderExtraLine model."""
@ -1830,6 +1851,11 @@ class SalesOrderAllocation(models.Model):
quantity: Quantity to take from the StockItem
"""
class Meta:
"""Model meta options."""
verbose_name = _('Sales Order Allocation')
@staticmethod
def get_api_url():
"""Return the API URL associated with the SalesOrderAllocation model."""
@ -2001,7 +2027,7 @@ class ReturnOrder(TotalPriceMixin, Order):
return ReturnOrderStatusGroups
@classmethod
def api_defaults(cls, request):
def api_defaults(cls, request=None):
"""Return default values for this model when issuing an API OPTIONS request."""
defaults = {
'reference': order.validators.generate_next_return_order_reference()
@ -2208,6 +2234,7 @@ class ReturnOrderLineItem(OrderLineItem):
class Meta:
"""Metaclass options for this model."""
verbose_name = _('Return Order Line Item')
unique_together = [('order', 'item')]
@staticmethod
@ -2270,6 +2297,11 @@ class ReturnOrderLineItem(OrderLineItem):
class ReturnOrderExtraLine(OrderExtraLine):
"""Model for a single ExtraLine in a ReturnOrder."""
class Meta:
"""Metaclass options for this model."""
verbose_name = _('Return Order Extra Line')
@staticmethod
def get_api_url():
"""Return the API URL associated with the ReturnOrderExtraLine model."""

View File

@ -33,6 +33,8 @@ from company.serializers import (
ContactSerializer,
SupplierPartSerializer,
)
from importer.mixins import DataImportExportSerializerMixin
from importer.registry import register_importer
from InvenTree.helpers import (
current_date,
extract_serial_numbers,
@ -72,9 +74,11 @@ class TotalPriceMixin(serializers.Serializer):
)
class AbstractOrderSerializer(serializers.Serializer):
class AbstractOrderSerializer(DataImportExportSerializerMixin, serializers.Serializer):
"""Abstract serializer class which provides fields common to all order types."""
export_exclude_fields = ['notes']
# Number of line items in this order
line_items = serializers.IntegerField(read_only=True, label=_('Line Items'))
@ -100,6 +104,10 @@ class AbstractOrderSerializer(serializers.Serializer):
source='responsible', read_only=True, many=False
)
project_code = serializers.CharField(
source='project_code.code', label=_('Project Code'), read_only=True
)
# Detail for project code field
project_code_detail = ProjectCodeSerializer(
source='project_code', read_only=True, many=False
@ -159,7 +167,17 @@ class AbstractOrderSerializer(serializers.Serializer):
] + extra_fields
class AbstractExtraLineSerializer(serializers.Serializer):
class AbstractLineItemSerializer:
"""Abstract serializer for LineItem object."""
target_date = serializers.DateField(
required=False, allow_null=True, label=_('Target Date')
)
class AbstractExtraLineSerializer(
DataImportExportSerializerMixin, serializers.Serializer
):
"""Abstract Serializer for a ExtraLine object."""
def __init__(self, *args, **kwargs):
@ -169,7 +187,7 @@ class AbstractExtraLineSerializer(serializers.Serializer):
super().__init__(*args, **kwargs)
if order_detail is not True:
self.fields.pop('order_detail')
self.fields.pop('order_detail', None)
quantity = serializers.FloatField()
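Several hunks in this file change `self.fields.pop('x')` to `self.fields.pop('x', None)`. The second argument matters because `dict.pop` raises `KeyError` when the key is absent and no default is supplied, which can happen here if a field was already removed (for example, by export field filtering):

```python
fields = {'quantity': 1.0}

# With a default, pop() is safe even if the field was already removed
assert fields.pop('order_detail', None) is None

# Without a default, a missing key raises KeyError
try:
    fields.pop('order_detail')
    raised = False
except KeyError:
    raised = True

assert raised
assert fields == {'quantity': 1.0}
```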
@ -196,6 +214,7 @@ class AbstractExtraLineMeta:
]
@register_importer()
class PurchaseOrderSerializer(
NotesFieldMixin, TotalPriceMixin, AbstractOrderSerializer, InvenTreeModelSerializer
):
@ -230,7 +249,7 @@ class PurchaseOrderSerializer(
super().__init__(*args, **kwargs)
if supplier_detail is not True:
self.fields.pop('supplier_detail')
self.fields.pop('supplier_detail', None)
@staticmethod
def annotate_queryset(queryset):
@ -338,7 +357,12 @@ class PurchaseOrderIssueSerializer(serializers.Serializer):
order.place_order()
class PurchaseOrderLineItemSerializer(InvenTreeModelSerializer):
@register_importer()
class PurchaseOrderLineItemSerializer(
DataImportExportSerializerMixin,
AbstractLineItemSerializer,
InvenTreeModelSerializer,
):
"""Serializer class for the PurchaseOrderLineItem model."""
class Meta:
@ -367,6 +391,11 @@ class PurchaseOrderLineItemSerializer(InvenTreeModelSerializer):
'total_price',
'link',
'merge_items',
'sku',
'mpn',
'ipn',
'internal_part',
'internal_part_name',
]
def __init__(self, *args, **kwargs):
@ -378,11 +407,11 @@ class PurchaseOrderLineItemSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if part_detail is not True:
self.fields.pop('part_detail')
self.fields.pop('supplier_part_detail')
self.fields.pop('part_detail', None)
self.fields.pop('supplier_part_detail', None)
if order_detail is not True:
self.fields.pop('order_detail')
self.fields.pop('order_detail', None)
def skip_create_fields(self):
"""Return a list of fields to skip when creating a new object."""
@ -480,6 +509,25 @@ class PurchaseOrderLineItemSerializer(InvenTreeModelSerializer):
'Merge items with the same part, destination and target date into one line item'
),
default=True,
write_only=True,
)
sku = serializers.CharField(source='part.SKU', read_only=True, label=_('SKU'))
mpn = serializers.CharField(
source='part.manufacturer_part.MPN', read_only=True, label=_('MPN')
)
ipn = serializers.CharField(
source='part.part.IPN', read_only=True, label=_('Internal Part Number')
)
internal_part = serializers.PrimaryKeyRelatedField(
source='part.part', read_only=True, many=False, label=_('Internal Part')
)
internal_part_name = serializers.CharField(
source='part.part.name', read_only=True, label=_('Internal Part Name')
)
def validate(self, data):
@ -513,6 +561,7 @@ class PurchaseOrderLineItemSerializer(InvenTreeModelSerializer):
return data
@register_importer()
class PurchaseOrderExtraLineSerializer(
AbstractExtraLineSerializer, InvenTreeModelSerializer
):
@ -755,6 +804,7 @@ class PurchaseOrderReceiveSerializer(serializers.Serializer):
raise ValidationError(detail=serializers.as_serializer_error(exc))
@register_importer()
class SalesOrderSerializer(
NotesFieldMixin, TotalPriceMixin, AbstractOrderSerializer, InvenTreeModelSerializer
):
@ -785,7 +835,7 @@ class SalesOrderSerializer(
super().__init__(*args, **kwargs)
if customer_detail is not True:
self.fields.pop('customer_detail')
self.fields.pop('customer_detail', None)
@staticmethod
def annotate_queryset(queryset):
@ -872,19 +922,19 @@ class SalesOrderAllocationSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not order_detail:
self.fields.pop('order_detail')
self.fields.pop('order_detail', None)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if not item_detail:
self.fields.pop('item_detail')
self.fields.pop('item_detail', None)
if not location_detail:
self.fields.pop('location_detail')
self.fields.pop('location_detail', None)
if not customer_detail:
self.fields.pop('customer_detail')
self.fields.pop('customer_detail', None)
part = serializers.PrimaryKeyRelatedField(source='item.part', read_only=True)
order = serializers.PrimaryKeyRelatedField(
@ -914,7 +964,12 @@ class SalesOrderAllocationSerializer(InvenTreeModelSerializer):
)
class SalesOrderLineItemSerializer(InvenTreeModelSerializer):
@register_importer()
class SalesOrderLineItemSerializer(
DataImportExportSerializerMixin,
AbstractLineItemSerializer,
InvenTreeModelSerializer,
):
"""Serializer for a SalesOrderLineItem object."""
class Meta:
@ -957,16 +1012,16 @@ class SalesOrderLineItemSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if part_detail is not True:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if order_detail is not True:
self.fields.pop('order_detail')
self.fields.pop('order_detail', None)
if allocations is not True:
self.fields.pop('allocations')
self.fields.pop('allocations', None)
if customer_detail is not True:
self.fields.pop('customer_detail')
self.fields.pop('customer_detail', None)
@staticmethod
def annotate_queryset(queryset):
@ -1063,6 +1118,7 @@ class SalesOrderLineItemSerializer(InvenTreeModelSerializer):
)
@register_importer()
class SalesOrderShipmentSerializer(NotesFieldMixin, InvenTreeModelSerializer):
"""Serializer for the SalesOrderShipment class."""
@ -1499,6 +1555,7 @@ class SalesOrderShipmentAllocationSerializer(serializers.Serializer):
allocation.save()
@register_importer()
class SalesOrderExtraLineSerializer(
AbstractExtraLineSerializer, InvenTreeModelSerializer
):
@ -1512,6 +1569,7 @@ class SalesOrderExtraLineSerializer(
order_detail = SalesOrderSerializer(source='order', many=False, read_only=True)
@register_importer()
class ReturnOrderSerializer(
NotesFieldMixin, AbstractOrderSerializer, TotalPriceMixin, InvenTreeModelSerializer
):
@ -1539,7 +1597,7 @@ class ReturnOrderSerializer(
super().__init__(*args, **kwargs)
if customer_detail is not True:
self.fields.pop('customer_detail')
self.fields.pop('customer_detail', None)
@staticmethod
def annotate_queryset(queryset):
@ -1690,7 +1748,12 @@ class ReturnOrderReceiveSerializer(serializers.Serializer):
order.receive_line_item(line_item, location, request.user)
class ReturnOrderLineItemSerializer(InvenTreeModelSerializer):
@register_importer()
class ReturnOrderLineItemSerializer(
DataImportExportSerializerMixin,
AbstractLineItemSerializer,
InvenTreeModelSerializer,
):
"""Serializer for a ReturnOrderLineItem object."""
class Meta:
@ -1725,13 +1788,13 @@ class ReturnOrderLineItemSerializer(InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not order_detail:
self.fields.pop('order_detail')
self.fields.pop('order_detail', None)
if not item_detail:
self.fields.pop('item_detail')
self.fields.pop('item_detail', None)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
order_detail = ReturnOrderSerializer(source='order', many=False, read_only=True)
item_detail = stock.serializers.StockItemSerializer(
@ -1743,6 +1806,7 @@ class ReturnOrderLineItemSerializer(InvenTreeModelSerializer):
price_currency = InvenTreeCurrencySerializer(help_text=_('Line price currency'))
@register_importer()
class ReturnOrderExtraLineSerializer(
AbstractExtraLineSerializer, InvenTreeModelSerializer
):

View File

@ -793,14 +793,14 @@ class PurchaseOrderDownloadTest(OrderTest):
"""Unit tests for downloading PurchaseOrder data via the API endpoint."""
required_cols = [
'id',
'line_items',
'description',
'issue_date',
'notes',
'reference',
'status',
'supplier_reference',
'ID',
'Line Items',
'Description',
'Issue Date',
'Order Currency',
'Reference',
'Order Status',
'Supplier Reference',
]
excluded_cols = ['metadata']
@ -818,7 +818,7 @@ class PurchaseOrderDownloadTest(OrderTest):
reverse('api-po-list'),
{'export': 'csv'},
expected_code=200,
expected_fn='InvenTree_PurchaseOrders.csv',
expected_fn=r'InvenTree_PurchaseOrder_.+\.csv',
) as file:
data = self.process_csv(
file,
@ -828,10 +828,10 @@ class PurchaseOrderDownloadTest(OrderTest):
)
for row in data:
order = models.PurchaseOrder.objects.get(pk=row['id'])
order = models.PurchaseOrder.objects.get(pk=row['ID'])
self.assertEqual(order.description, row['description'])
self.assertEqual(order.reference, row['reference'])
self.assertEqual(order.description, row['Description'])
self.assertEqual(order.reference, row['Reference'])
def test_download_line_items(self):
"""Test that the PurchaseOrderLineItems can be downloaded to a file."""
@ -840,7 +840,7 @@ class PurchaseOrderDownloadTest(OrderTest):
{'export': 'xlsx'},
decode=False,
expected_code=200,
expected_fn='InvenTree_PurchaseOrderItems.xlsx',
expected_fn=r'InvenTree_PurchaseOrderLineItem.+\.xlsx',
) as file:
self.assertIsInstance(file, io.BytesIO)
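The `expected_fn` values above change from fixed strings to regular expressions, which suggests exported filenames now carry a generated component (an assumption; a timestamp would be typical). For example:

```python
# Assumed behaviour: exported filenames include a generated suffix, so the
# tests match them with a regular expression rather than a fixed name.
import re

pattern = r'InvenTree_PurchaseOrder_.+\.csv'

assert re.fullmatch(pattern, 'InvenTree_PurchaseOrder_2024-05-01.csv')
assert re.fullmatch(pattern, 'InvenTree_PurchaseOrders.csv') is None
```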
@ -1473,13 +1473,13 @@ class SalesOrderTest(OrderTest):
order.save()
# Download file, check we get a 200 response
for fmt in ['csv', 'xls', 'xlsx']:
for fmt in ['csv', 'xlsx', 'tsv']:
self.download_file(
reverse('api-so-list'),
{'export': fmt},
decode=True if fmt == 'csv' else False,
expected_code=200,
expected_fn=f'InvenTree_SalesOrders.{fmt}',
expected_fn=r'InvenTree_SalesOrder_.+',
)
def test_sales_order_complete(self):
@ -1635,17 +1635,13 @@ class SalesOrderDownloadTest(OrderTest):
with self.assertRaises(ValueError):
self.download_file(url, {}, expected_code=200)
def test_download_xls(self):
"""Test xls file download."""
def test_download_xlsx(self):
"""Test xlsx file download."""
url = reverse('api-so-list')
# Download .xls file
with self.download_file(
url,
{'export': 'xls'},
expected_code=200,
expected_fn='InvenTree_SalesOrders.xls',
decode=False,
url, {'export': 'xlsx'}, expected_code=200, decode=False
) as file:
self.assertIsInstance(file, io.BytesIO)
@ -1654,25 +1650,22 @@ class SalesOrderDownloadTest(OrderTest):
url = reverse('api-so-list')
required_cols = [
'line_items',
'id',
'reference',
'customer',
'status',
'shipment_date',
'notes',
'description',
'Line Items',
'ID',
'Reference',
'Customer',
'Order Status',
'Shipment Date',
'Description',
'Project Code',
'Responsible',
]
excluded_cols = ['metadata']
# Download .xls file
with self.download_file(
url,
{'export': 'csv'},
expected_code=200,
expected_fn='InvenTree_SalesOrders.csv',
decode=True,
url, {'export': 'csv'}, expected_code=200, decode=True
) as file:
data = self.process_csv(
file,
@ -1682,18 +1675,14 @@ class SalesOrderDownloadTest(OrderTest):
)
for line in data:
order = models.SalesOrder.objects.get(pk=line['id'])
order = models.SalesOrder.objects.get(pk=line['ID'])
self.assertEqual(line['description'], order.description)
self.assertEqual(line['status'], str(order.status))
self.assertEqual(line['Description'], order.description)
self.assertEqual(line['Order Status'], str(order.status))
# Download only outstanding sales orders
with self.download_file(
url,
{'export': 'tsv', 'outstanding': True},
expected_code=200,
expected_fn='InvenTree_SalesOrders.tsv',
decode=True,
url, {'export': 'tsv', 'outstanding': True}, expected_code=200, decode=True
) as file:
self.process_csv(
file,

View File

@ -19,7 +19,8 @@ import order.models
import part.filters
from build.models import Build, BuildItem
from build.status_codes import BuildStatusGroups
from InvenTree.api import APIDownloadMixin, ListCreateDestroyAPIView, MetadataView
from importer.mixins import DataExportViewMixin
from InvenTree.api import ListCreateDestroyAPIView, MetadataView
from InvenTree.filters import (
ORDER_FILTER,
ORDER_FILTER_ALIAS,
@ -28,7 +29,7 @@ from InvenTree.filters import (
InvenTreeDateFilter,
InvenTreeSearchFilter,
)
from InvenTree.helpers import DownloadFile, increment_serial_number, isNull, str2bool
from InvenTree.helpers import increment_serial_number, isNull, str2bool
from InvenTree.mixins import (
CreateAPI,
CustomRetrieveUpdateDestroyAPI,
@ -228,7 +229,7 @@ class CategoryFilter(rest_filters.FilterSet):
return queryset
class CategoryList(CategoryMixin, APIDownloadMixin, ListCreateAPI):
class CategoryList(CategoryMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of PartCategory objects.
- GET: Return a list of PartCategory objects
@ -237,14 +238,6 @@ class CategoryList(CategoryMixin, APIDownloadMixin, ListCreateAPI):
filterset_class = CategoryFilter
def download_queryset(self, queryset, export_format):
"""Download the filtered queryset as a data file."""
dataset = PartCategoryResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_Categories.{export_format}'
return DownloadFile(filedata, filename)
filter_backends = SEARCH_ORDER_FILTER
ordering_fields = ['name', 'pathstring', 'level', 'tree_id', 'lft', 'part_count']
@ -327,7 +320,7 @@ class CategoryTree(ListAPI):
return queryset
class CategoryParameterList(ListCreateAPI):
class CategoryParameterList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of PartCategoryParameterTemplate objects.
- GET: Return a list of PartCategoryParameterTemplate objects
@ -382,7 +375,7 @@ class PartSalePriceDetail(RetrieveUpdateDestroyAPI):
serializer_class = part_serializers.PartSalePriceSerializer
class PartSalePriceList(ListCreateAPI):
class PartSalePriceList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for list view of PartSalePriceBreak model."""
queryset = PartSellPriceBreak.objects.all()
@ -401,7 +394,7 @@ class PartInternalPriceDetail(RetrieveUpdateDestroyAPI):
serializer_class = part_serializers.PartInternalPriceSerializer
class PartInternalPriceList(ListCreateAPI):
class PartInternalPriceList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for list view of PartInternalPriceBreak model."""
queryset = PartInternalPriceBreak.objects.all()
@ -477,7 +470,7 @@ class PartTestTemplateDetail(PartTestTemplateMixin, RetrieveUpdateDestroyAPI):
pass
class PartTestTemplateList(PartTestTemplateMixin, ListCreateAPI):
class PartTestTemplateList(PartTestTemplateMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for listing (and creating) a PartTestTemplate."""
filterset_class = PartTestTemplateFilter
@ -1224,21 +1217,12 @@ class PartMixin:
return context
class PartList(PartMixin, APIDownloadMixin, ListCreateAPI):
class PartList(PartMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of Part objects, or creating a new Part instance."""
filterset_class = PartFilter
is_create = True
def download_queryset(self, queryset, export_format):
"""Download the filtered queryset as a data file."""
dataset = PartResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_Parts.{export_format}'
return DownloadFile(filedata, filename)
def filter_queryset(self, queryset):
"""Perform custom filtering of the queryset."""
params = self.request.query_params
@ -1534,7 +1518,9 @@ class PartParameterTemplateMixin:
return queryset
class PartParameterTemplateList(PartParameterTemplateMixin, ListCreateAPI):
class PartParameterTemplateList(
PartParameterTemplateMixin, DataExportViewMixin, ListCreateAPI
):
"""API endpoint for accessing a list of PartParameterTemplate objects.
- GET: Return list of PartParameterTemplate objects
@ -1615,7 +1601,7 @@ class PartParameterFilter(rest_filters.FilterSet):
return queryset.filter(part=part)
class PartParameterList(PartParameterAPIMixin, ListCreateAPI):
class PartParameterList(PartParameterAPIMixin, DataExportViewMixin, ListCreateAPI):
"""API endpoint for accessing a list of PartParameter objects.
- GET: Return list of PartParameter objects
@ -1843,7 +1829,7 @@ class BomMixin:
return queryset
class BomList(BomMixin, ListCreateDestroyAPIView):
class BomList(BomMixin, DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for accessing a list of BomItem objects.
- GET: Return list of BomItem objects

View File

@ -19,5 +19,8 @@ class Migration(migrations.Migration):
('data', models.CharField(help_text='Parameter Value', max_length=100)),
('part', models.ForeignKey(help_text='Parent Part', on_delete=django.db.models.deletion.CASCADE, related_name='parameters', to='part.Part')),
],
options={
'verbose_name': 'Part Parameter',
},
),
]

View File

@ -18,6 +18,9 @@ class Migration(migrations.Migration):
('name', models.CharField(help_text='Parameter Name', max_length=100)),
('units', models.CharField(blank=True, help_text='Parameter Units', max_length=25)),
],
options={
'verbose_name': 'Part Parameter Template',
},
),
migrations.RemoveField(
model_name='partparameter',

View File

@ -19,5 +19,8 @@ class Migration(migrations.Migration):
('required', models.BooleanField(default=True, help_text='Is this test required to pass?', verbose_name='Required')),
('part', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='test_templates', to='part.Part')),
],
options={
'verbose_name': 'Part Test Template',
},
),
]

View File

@ -24,6 +24,7 @@ class Migration(migrations.Migration):
('part', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='salepricebreaks', to='part.Part')),
],
options={
'verbose_name': 'Part Sale Price Break',
'unique_together': {('part', 'quantity')},
},
),

View File

@ -19,6 +19,9 @@ class Migration(migrations.Migration):
('category', models.ForeignKey(help_text='Part Category', on_delete=django.db.models.deletion.CASCADE, related_name='parameter_templates', to='part.PartCategory')),
('parameter_template', models.ForeignKey(help_text='Parameter Template', on_delete=django.db.models.deletion.CASCADE, related_name='part_categories', to='part.PartParameterTemplate')),
],
options={
'verbose_name': 'Part Category Parameter Template',
},
),
migrations.AddConstraint(
model_name='partcategoryparametertemplate',

View File

@ -3288,6 +3288,7 @@ class PartSellPriceBreak(common.models.PriceBreak):
class Meta:
"""Metaclass providing extra model definition."""
verbose_name = _('Part Sale Price Break')
unique_together = ('part', 'quantity')
@staticmethod
@ -3396,6 +3397,11 @@ class PartTestTemplate(InvenTree.models.InvenTreeMetadataModel):
run on the model (refer to the validate_unique function).
"""
class Meta:
"""Metaclass options for the PartTestTemplate model."""
verbose_name = _('Part Test Template')
def __str__(self):
"""Format a string representation of this PartTestTemplate."""
return ' | '.join([self.part.name, self.test_name])
@ -3555,6 +3561,11 @@ class PartParameterTemplate(InvenTree.models.InvenTreeMetadataModel):
choices: List of valid choices for the parameter [string]
"""
class Meta:
"""Metaclass options for the PartParameterTemplate model."""
verbose_name = _('Part Parameter Template')
@staticmethod
def get_api_url():
"""Return the list API endpoint URL associated with the PartParameterTemplate model."""
@ -3699,6 +3710,7 @@ class PartParameter(InvenTree.models.InvenTreeMetadataModel):
class Meta:
"""Metaclass providing extra model definition."""
verbose_name = _('Part Parameter')
# Prevent multiple instances of a parameter for a single part
unique_together = ('part', 'template')
@ -3841,9 +3853,16 @@ class PartCategoryParameterTemplate(InvenTree.models.InvenTreeMetadataModel):
category
"""
@staticmethod
def get_api_url():
"""Return the API endpoint URL associated with the PartCategoryParameterTemplate model."""
return reverse('api-part-category-parameter-list')
class Meta:
"""Metaclass providing extra model definition."""
verbose_name = _('Part Category Parameter Template')
constraints = [
UniqueConstraint(
fields=['category', 'parameter_template'],
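The models diff above adds a translatable `verbose_name` to each model's `Meta` class. Stripped of Django specifics, the pattern is just wrapping the label in the translation function; a minimal non-Django sketch (illustrative only, using the stdlib `gettext` in place of Django's lazy translation):

```python
from gettext import gettext as _

# Each model's Meta gains a translatable verbose_name, so admin and UI
# labels can be localized. With no translation catalog installed,
# gettext returns the input string unchanged.
class PartParameter:
    class Meta:
        verbose_name = _('Part Parameter')
```

In Django itself this uses `django.utils.translation.gettext_lazy`, so the string is resolved per-request rather than at import time.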

View File

@ -34,6 +34,8 @@ import part.tasks
import stock.models
import users.models
from build.status_codes import BuildStatusGroups
from importer.mixins import DataImportExportSerializerMixin
from importer.registry import register_importer
from InvenTree.tasks import offload_task
from .models import (
@ -57,7 +59,10 @@ from .models import (
logger = logging.getLogger('inventree')
class CategorySerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class CategorySerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for PartCategory."""
class Meta:
@ -82,6 +87,7 @@ class CategorySerializer(InvenTree.serializers.InvenTreeModelSerializer):
'icon',
'parent_default_location',
]
read_only_fields = ['level', 'pathstring']
def __init__(self, *args, **kwargs):
"""Optionally add or remove extra fields."""
@ -90,7 +96,7 @@ class CategorySerializer(InvenTree.serializers.InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not path_detail:
self.fields.pop('path')
self.fields.pop('path', None)
def get_starred(self, category) -> bool:
"""Return True if the category is directly "starred" by the current user."""
@ -153,7 +159,10 @@ class CategoryTree(InvenTree.serializers.InvenTreeModelSerializer):
return queryset.annotate(subcategories=part.filters.annotate_sub_categories())
class PartTestTemplateSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class PartTestTemplateSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for the PartTestTemplate class."""
class Meta:
@ -188,7 +197,10 @@ class PartTestTemplateSerializer(InvenTree.serializers.InvenTreeModelSerializer)
return queryset.annotate(results=SubqueryCount('test_results'))
class PartSalePriceSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class PartSalePriceSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for sale prices for Part model."""
class Meta:
@ -253,7 +265,10 @@ class PartThumbSerializerUpdate(InvenTree.serializers.InvenTreeModelSerializer):
image = InvenTree.serializers.InvenTreeAttachmentSerializerField(required=True)
class PartParameterTemplateSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class PartParameterTemplateSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""JSON serializer for the PartParameterTemplate model."""
class Meta:
@ -314,8 +329,8 @@ class PartBriefSerializer(InvenTree.serializers.InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not pricing:
self.fields.pop('pricing_min')
self.fields.pop('pricing_max')
self.fields.pop('pricing_min', None)
self.fields.pop('pricing_max', None)
category_default_location = serializers.IntegerField(read_only=True)
@ -331,7 +346,10 @@ class PartBriefSerializer(InvenTree.serializers.InvenTreeModelSerializer):
)
class PartParameterSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class PartParameterSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""JSON serializers for the PartParameter model."""
class Meta:
@ -359,10 +377,10 @@ class PartParameterSerializer(InvenTree.serializers.InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if not template_detail:
self.fields.pop('template_detail')
self.fields.pop('template_detail', None)
part_detail = PartBriefSerializer(source='part', many=False, read_only=True)
template_detail = PartParameterTemplateSerializer(
@ -573,7 +591,9 @@ class InitialSupplierSerializer(serializers.Serializer):
return data
@register_importer()
class PartSerializer(
DataImportExportSerializerMixin,
InvenTree.serializers.NotesFieldMixin,
InvenTree.serializers.RemoteImageMixin,
InvenTree.serializers.InvenTreeTagModelSerializer,
@ -595,6 +615,7 @@ class PartSerializer(
'category',
'category_detail',
'category_path',
'category_name',
'component',
'creation_date',
'creation_user',
@ -671,13 +692,13 @@ class PartSerializer(
super().__init__(*args, **kwargs)
if not category_detail:
self.fields.pop('category_detail')
self.fields.pop('category_detail', None)
if not parameters:
self.fields.pop('parameters')
self.fields.pop('parameters', None)
if not path_detail:
self.fields.pop('category_path')
self.fields.pop('category_path', None)
if not create:
# These fields are only used for the LIST API endpoint
@ -685,12 +706,12 @@ class PartSerializer(
# Fields required for certain operations, but are not part of the model
if f in ['remote_image', 'existing_image']:
continue
self.fields.pop(f)
self.fields.pop(f, None)
if not pricing:
self.fields.pop('pricing_min')
self.fields.pop('pricing_max')
self.fields.pop('pricing_updated')
self.fields.pop('pricing_min', None)
self.fields.pop('pricing_max', None)
self.fields.pop('pricing_updated', None)
def get_api_url(self):
"""Return the API url associated with this serializer."""
@ -809,6 +830,10 @@ class PartSerializer(
child=serializers.DictField(), source='category.get_path', read_only=True
)
category_name = serializers.CharField(
source='category.name', read_only=True, label=_('Category Name')
)
responsible = serializers.PrimaryKeyRelatedField(
queryset=users.models.Owner.objects.all(),
required=False,
@ -823,8 +848,8 @@ class PartSerializer(
# Annotated fields
allocated_to_build_orders = serializers.FloatField(read_only=True)
allocated_to_sales_orders = serializers.FloatField(read_only=True)
building = serializers.FloatField(read_only=True)
in_stock = serializers.FloatField(read_only=True)
building = serializers.FloatField(read_only=True, label=_('Building'))
in_stock = serializers.FloatField(read_only=True, label=_('In Stock'))
ordering = serializers.FloatField(read_only=True, label=_('On Order'))
required_for_build_orders = serializers.IntegerField(read_only=True)
required_for_sales_orders = serializers.IntegerField(read_only=True)
@ -1412,7 +1437,10 @@ class BomItemSubstituteSerializer(InvenTree.serializers.InvenTreeModelSerializer
)
class BomItemSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class BomItemSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for BomItem object."""
class Meta:
@ -1464,17 +1492,17 @@ class BomItemSerializer(InvenTree.serializers.InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if not sub_part_detail:
self.fields.pop('sub_part_detail')
self.fields.pop('sub_part_detail', None)
if not pricing:
self.fields.pop('pricing_min')
self.fields.pop('pricing_max')
self.fields.pop('pricing_min_total')
self.fields.pop('pricing_max_total')
self.fields.pop('pricing_updated')
self.fields.pop('pricing_min', None)
self.fields.pop('pricing_max', None)
self.fields.pop('pricing_min_total', None)
self.fields.pop('pricing_max_total', None)
self.fields.pop('pricing_updated', None)
quantity = InvenTree.serializers.InvenTreeDecimalField(required=True)
@ -1679,8 +1707,9 @@ class BomItemSerializer(InvenTree.serializers.InvenTreeModelSerializer):
return queryset
@register_importer()
class CategoryParameterTemplateSerializer(
InvenTree.serializers.InvenTreeModelSerializer
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for the PartCategoryParameterTemplate model."""
@ -1771,7 +1800,10 @@ class PartCopyBOMSerializer(serializers.Serializer):
class BomImportUploadSerializer(InvenTree.serializers.DataFileUploadSerializer):
"""Serializer for uploading a file and extracting data from it."""
"""Serializer for uploading a file and extracting data from it.
TODO: Delete this entirely once the new importer process is working
"""
TARGET_MODEL = BomItem
@ -1804,6 +1836,8 @@ class BomImportExtractSerializer(InvenTree.serializers.DataFileExtractSerializer
"""Serializer class for extracting BOM data from an uploaded file.
The parent class DataFileExtractSerializer does most of the heavy lifting here.
TODO: Delete this entirely once the new importer process is working
"""
TARGET_MODEL = BomItem
@ -1891,7 +1925,9 @@ class BomImportExtractSerializer(InvenTree.serializers.DataFileExtractSerializer
class BomImportSubmitSerializer(serializers.Serializer):
"""Serializer for uploading a BOM against a specified part.
A "BOM" is a set of BomItem objects which are to be validated together as a set
A "BOM" is a set of BomItem objects which are to be validated together as a set.
TODO: Delete this entirely once the new importer process is working
"""
items = BomItemSerializer(many=True, required=True)
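The registry internals are not shown in this diff, but from the usage pattern (`@register_importer()` decorating each serializer class), a decorator-based class registry might look like the following sketch. All names here are illustrative assumptions, not InvenTree's actual implementation:

```python
# Minimal sketch of a decorator-based serializer registry.
class SerializerRegistry:
    def __init__(self):
        self.serializers = []

    def register(self, cls):
        """Record the serializer class and return it unchanged."""
        self.serializers.append(cls)
        return cls

registry = SerializerRegistry()

def register_importer():
    """Class decorator factory, matching the @register_importer() call style."""
    def wrapper(cls):
        return registry.register(cls)
    return wrapper

@register_importer()
class CategorySerializer:
    pass
```

Because the decorator returns the class unchanged, registration is transparent to any other code importing the serializer.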

View File

@ -1033,25 +1033,26 @@ class PartAPITest(PartAPITestBase):
url = reverse('api-part-list')
required_cols = [
'Part ID',
'Part Name',
'Part Description',
'In Stock',
'ID',
'Name',
'Description',
'Total Stock',
'Category Name',
'Keywords',
'Template',
'Is Template',
'Virtual',
'Trackable',
'Active',
'Notes',
'creation_date',
'Creation Date',
'On Order',
'In Stock',
'Link',
]
excluded_cols = ['lft', 'rght', 'level', 'tree_id', 'metadata']
with self.download_file(
url, {'export': 'csv'}, expected_fn='InvenTree_Parts.csv'
) as file:
with self.download_file(url, {'export': 'csv'}) as file:
data = self.process_csv(
file,
excluded_cols=excluded_cols,
@ -1060,13 +1061,13 @@ class PartAPITest(PartAPITestBase):
)
for row in data:
part = Part.objects.get(pk=row['Part ID'])
part = Part.objects.get(pk=row['ID'])
if part.IPN:
self.assertEqual(part.IPN, row['IPN'])
self.assertEqual(part.name, row['Part Name'])
self.assertEqual(part.description, row['Part Description'])
self.assertEqual(part.name, row['Name'])
self.assertEqual(part.description, row['Description'])
if part.category:
self.assertEqual(part.category.name, row['Category Name'])
@ -2936,7 +2937,7 @@ class PartTestTemplateTest(PartAPITestBase):
options = response.data['actions']['PUT']
self.assertTrue(options['pk']['read_only'])
self.assertTrue(options['pk']['required'])
self.assertFalse(options['pk']['required'])
self.assertEqual(options['part']['api_url'], '/api/part/')
self.assertTrue(options['test_name']['required'])
self.assertFalse(options['test_name']['read_only'])

View File

@ -29,11 +29,11 @@ class BomExportTest(InvenTreeTestCase):
url = reverse('api-bom-upload-template')
# Download an XLS template
response = self.client.get(url, data={'format': 'xls'})
response = self.client.get(url, data={'format': 'xlsx'})
self.assertEqual(response.status_code, 200)
self.assertEqual(
response.headers['Content-Disposition'],
'attachment; filename="InvenTree_BOM_Template.xls"',
'attachment; filename="InvenTree_BOM_Template.xlsx"',
)
# Return a simple CSV template
@ -134,10 +134,10 @@ class BomExportTest(InvenTreeTestCase):
for header in headers:
self.assertIn(header, expected)
def test_export_xls(self):
"""Test BOM download in XLS format."""
def test_export_xlsx(self):
"""Test BOM download in XLSX format."""
params = {
'format': 'xls',
'format': 'xlsx',
'cascade': True,
'parameter_data': True,
'stock_data': True,

View File

@ -131,6 +131,7 @@ def allow_table_event(table_name):
'socialaccount_',
'user_',
'users_',
'importer_',
]
if any(table_name.startswith(prefix) for prefix in ignore_prefixes):
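The `allow_table_event` hunk above suppresses table-change events for internal tables, now including the new `importer_` tables. The check reduces to a simple prefix test; a standalone extraction of that logic (prefix list abbreviated from the diff):

```python
# Tables whose change events should be ignored (internal bookkeeping).
IGNORE_PREFIXES = ['django_q_', 'socialaccount_', 'user_', 'users_', 'importer_']

def allow_table_event(table_name: str) -> bool:
    """Return False for internal tables which should not emit events."""
    return not any(table_name.startswith(prefix) for prefix in IGNORE_PREFIXES)
```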

View File

@ -28,7 +28,8 @@ from build.serializers import BuildSerializer
from company.models import Company, SupplierPart
from company.serializers import CompanySerializer
from generic.states.api import StatusView
from InvenTree.api import APIDownloadMixin, ListCreateDestroyAPIView, MetadataView
from importer.mixins import DataExportViewMixin
from InvenTree.api import ListCreateDestroyAPIView, MetadataView
from InvenTree.filters import (
ORDER_FILTER_ALIAS,
SEARCH_ORDER_FILTER,
@ -36,7 +37,6 @@ from InvenTree.filters import (
InvenTreeDateFilter,
)
from InvenTree.helpers import (
DownloadFile,
extract_serial_numbers,
generateTestKey,
is_ajax,
@ -399,7 +399,7 @@ class StockLocationFilter(rest_filters.FilterSet):
return queryset
class StockLocationList(APIDownloadMixin, ListCreateAPI):
class StockLocationList(DataExportViewMixin, ListCreateAPI):
"""API endpoint for list view of StockLocation objects.
- GET: Return list of StockLocation objects
@ -410,14 +410,6 @@ class StockLocationList(APIDownloadMixin, ListCreateAPI):
serializer_class = StockSerializers.LocationSerializer
filterset_class = StockLocationFilter
def download_queryset(self, queryset, export_format):
"""Download the filtered queryset as a data file."""
dataset = LocationResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_Locations.{export_format}'
return DownloadFile(filedata, filename)
def get_queryset(self, *args, **kwargs):
"""Return annotated queryset for the StockLocationList endpoint."""
queryset = super().get_queryset(*args, **kwargs)
@ -870,7 +862,7 @@ class StockFilter(rest_filters.FilterSet):
return queryset.exclude(stale_filter)
class StockList(APIDownloadMixin, ListCreateDestroyAPIView):
class StockList(DataExportViewMixin, ListCreateDestroyAPIView):
"""API endpoint for list view of Stock objects.
- GET: Return a list of all StockItem objects (with optional query filters)
@ -1088,19 +1080,6 @@ class StockList(APIDownloadMixin, ListCreateDestroyAPIView):
headers=self.get_success_headers(serializer.data),
)
def download_queryset(self, queryset, export_format):
"""Download this queryset as a file.
Uses the APIDownloadMixin mixin class
"""
dataset = StockItemResource().export(queryset=queryset)
filedata = dataset.export(export_format)
filename = f'InvenTree_StockItems_{InvenTree.helpers.current_date().strftime("%d-%b-%Y")}.{export_format}'
return DownloadFile(filedata, filename)
def get_queryset(self, *args, **kwargs):
"""Annotate queryset before returning."""
queryset = super().get_queryset(*args, **kwargs)
@ -1211,6 +1190,7 @@ class StockList(APIDownloadMixin, ListCreateDestroyAPIView):
'updated',
'stocktake_date',
'expiry_date',
'packaging',
'quantity',
'stock',
'status',
@ -1370,7 +1350,7 @@ class StockTrackingDetail(RetrieveAPI):
serializer_class = StockSerializers.StockTrackingSerializer
class StockTrackingList(ListAPI):
class StockTrackingList(DataExportViewMixin, ListAPI):
"""API endpoint for list view of StockItemTracking objects.
StockItemTracking objects are read-only
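The stock API diff removes the per-view `download_queryset()` implementations (each building its own `Resource`, filename, and `DownloadFile` call) in favour of a shared `DataExportViewMixin`. The shape of that refactor, sketched with illustrative names rather than InvenTree's actual mixin:

```python
# Sketch: one shared export path instead of a download_queryset() per view.
class DataExportViewMixin:
    """Provide a single export implementation for any list endpoint."""

    export_filename = 'export'

    def export_queryset(self, rows, export_format='csv'):
        # The real mixin serializes the queryset via the registered
        # serializer; here we only demonstrate the shared naming logic.
        filename = f'{self.export_filename}.{export_format}'
        return filename, rows

class StockLocationList(DataExportViewMixin):
    export_filename = 'InvenTree_Locations'
```

Each view then only declares what differs (the filename stem), and export formatting logic lives in one place.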

View File

@ -64,6 +64,9 @@ class Migration(migrations.Migration):
('item', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='tracking_info', to='stock.StockItem')),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
],
options={
'verbose_name': 'Stock Item Tracking',
}
),
migrations.AddField(
model_name='stockitem',

View File

@ -25,5 +25,8 @@ class Migration(migrations.Migration):
('stock_item', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='test_results', to='stock.StockItem')),
('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to=settings.AUTH_USER_MODEL)),
],
options={
'verbose_name': 'Stock Item Test Result',
},
),
]

View File

@ -2311,6 +2311,11 @@ class StockItemTracking(InvenTree.models.InvenTreeModel):
deltas: The changes associated with this history item
"""
class Meta:
"""Meta data for the StockItemTracking class."""
verbose_name = _('Stock Item Tracking')
@staticmethod
def get_api_url():
"""Return API url."""
@ -2379,6 +2384,11 @@ class StockItemTestResult(InvenTree.models.InvenTreeMetadataModel):
date: Date the test result was recorded
"""
class Meta:
"""Meta data for the StockItemTestResult class."""
verbose_name = _('Stock Item Test Result')
def __str__(self):
"""Return string representation."""
return f'{self.test_name} - {self.result}'

View File

@ -26,6 +26,8 @@ import stock.filters
import stock.status_codes
from common.settings import get_global_setting
from company.serializers import SupplierPartSerializer
from importer.mixins import DataImportExportSerializerMixin
from importer.registry import register_importer
from InvenTree.serializers import InvenTreeCurrencySerializer, InvenTreeDecimalField
from part.serializers import PartBriefSerializer, PartTestTemplateSerializer
@ -177,7 +179,10 @@ class LocationBriefSerializer(InvenTree.serializers.InvenTreeModelSerializer):
fields = ['pk', 'name', 'pathstring']
class StockItemTestResultSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class StockItemTestResultSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for the StockItemTestResult model."""
class Meta:
@ -212,10 +217,10 @@ class StockItemTestResultSerializer(InvenTree.serializers.InvenTreeModelSerializ
super().__init__(*args, **kwargs)
if user_detail is not True:
self.fields.pop('user_detail')
self.fields.pop('user_detail', None)
if template_detail is not True:
self.fields.pop('template_detail')
self.fields.pop('template_detail', None)
user_detail = InvenTree.serializers.UserSerializer(source='user', read_only=True)
@ -316,13 +321,22 @@ class StockItemSerializerBrief(
return value
class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
@register_importer()
class StockItemSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeTagModelSerializer
):
"""Serializer for a StockItem.
- Includes serialization for the linked part
- Includes serialization for the item location
"""
export_exclude_fields = ['tracking_items']
export_only_fields = ['part_pricing_min', 'part_pricing_max']
import_exclude_fields = ['use_pack_size', 'tags']
class Meta:
"""Metaclass options."""
@ -338,11 +352,13 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
'is_building',
'link',
'location',
'location_name',
'location_detail',
'location_path',
'notes',
'owner',
'packaging',
'parent',
'part',
'part_detail',
'purchase_order',
@ -356,6 +372,7 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
'status_text',
'stocktake_date',
'supplier_part',
'sku',
'supplier_part_detail',
'barcode_hash',
'updated',
@ -371,6 +388,9 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
'stale',
'tracking_items',
'tags',
# Export only fields
'part_pricing_min',
'part_pricing_max',
]
"""
@ -401,19 +421,19 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
super(StockItemSerializer, self).__init__(*args, **kwargs)
if not part_detail:
self.fields.pop('part_detail')
self.fields.pop('part_detail', None)
if not location_detail:
self.fields.pop('location_detail')
self.fields.pop('location_detail', None)
if not supplier_part_detail:
self.fields.pop('supplier_part_detail')
self.fields.pop('supplier_part_detail', None)
if not tests:
self.fields.pop('tests')
self.fields.pop('tests', None)
if not path_detail:
self.fields.pop('location_path')
self.fields.pop('location_path', None)
part = serializers.PrimaryKeyRelatedField(
queryset=part_models.Part.objects.all(),
@ -423,6 +443,17 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
label=_('Part'),
)
parent = serializers.PrimaryKeyRelatedField(
many=False,
read_only=True,
label=_('Parent Item'),
help_text=_('Parent stock item'),
)
location_name = serializers.CharField(
source='location.name', read_only=True, label=_('Location Name')
)
location_path = serializers.ListField(
child=serializers.DictField(), source='location.get_path', read_only=True
)
@ -468,6 +499,7 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
)
).prefetch_related(None),
),
'parent',
'part__category',
'part__pricing_data',
'supplier_part',
@ -525,6 +557,8 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
status_text = serializers.CharField(source='get_status_display', read_only=True)
sku = serializers.CharField(source='supplier_part.SKU', read_only=True)
# Optional detail fields, which can be appended via query parameters
supplier_part_detail = SupplierPartSerializer(
source='supplier_part',
@ -535,9 +569,11 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
read_only=True,
)
part_detail = PartBriefSerializer(source='part', many=False, read_only=True)
location_detail = LocationBriefSerializer(
source='location', many=False, read_only=True
)
tests = StockItemTestResultSerializer(
source='test_results', many=True, read_only=True
)
@ -545,12 +581,22 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
quantity = InvenTreeDecimalField()
# Annotated fields
allocated = serializers.FloatField(required=False)
expired = serializers.BooleanField(required=False, read_only=True)
installed_items = serializers.IntegerField(read_only=True, required=False)
child_items = serializers.IntegerField(read_only=True, required=False)
stale = serializers.BooleanField(required=False, read_only=True)
tracking_items = serializers.IntegerField(read_only=True, required=False)
allocated = serializers.FloatField(
required=False, read_only=True, label=_('Allocated Quantity')
)
expired = serializers.BooleanField(
required=False, read_only=True, label=_('Expired')
)
installed_items = serializers.IntegerField(
read_only=True, required=False, label=_('Installed Items')
)
child_items = serializers.IntegerField(
read_only=True, required=False, label=_('Child Items')
)
stale = serializers.BooleanField(required=False, read_only=True, label=_('Stale'))
tracking_items = serializers.IntegerField(
read_only=True, required=False, label=_('Tracking Items')
)
purchase_price = InvenTree.serializers.InvenTreeMoneySerializer(
label=_('Purchase Price'),
@ -571,6 +617,18 @@ class StockItemSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
tags = TagListSerializerField(required=False)
part_pricing_min = InvenTree.serializers.InvenTreeMoneySerializer(
source='part.pricing_data.overall_min',
read_only=True,
label=_('Minimum Pricing'),
)
part_pricing_max = InvenTree.serializers.InvenTreeMoneySerializer(
source='part.pricing_data.overall_max',
read_only=True,
label=_('Maximum Pricing'),
)
class SerializeStockItemSerializer(serializers.Serializer):
"""A DRF serializer for "serializing" a StockItem.
@ -1026,9 +1084,14 @@ class LocationTreeSerializer(InvenTree.serializers.InvenTreeModelSerializer):
return queryset.annotate(sublocations=stock.filters.annotate_sub_locations())
class LocationSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
@register_importer()
class LocationSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeTagModelSerializer
):
"""Detailed information about a stock location."""
import_exclude_fields = ['tags']
class Meta:
"""Metaclass options."""
@ -1055,7 +1118,7 @@ class LocationSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
'tags',
]
read_only_fields = ['barcode_hash', 'icon']
read_only_fields = ['barcode_hash', 'icon', 'level', 'pathstring']
def __init__(self, *args, **kwargs):
"""Optionally add or remove extra fields."""
@ -1064,7 +1127,7 @@ class LocationSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
super().__init__(*args, **kwargs)
if not path_detail:
self.fields.pop('path')
self.fields.pop('path', None)
@staticmethod
def annotate_queryset(queryset):
@ -1109,7 +1172,10 @@ class LocationSerializer(InvenTree.serializers.InvenTreeTagModelSerializer):
)
class StockTrackingSerializer(InvenTree.serializers.InvenTreeModelSerializer):
@register_importer()
class StockTrackingSerializer(
DataImportExportSerializerMixin, InvenTree.serializers.InvenTreeModelSerializer
):
"""Serializer for StockItemTracking model."""
class Meta:
@ -1139,10 +1205,10 @@ class StockTrackingSerializer(InvenTree.serializers.InvenTreeModelSerializer):
super().__init__(*args, **kwargs)
if item_detail is not True:
self.fields.pop('item_detail')
self.fields.pop('item_detail', None)
if user_detail is not True:
self.fields.pop('user_detail')
self.fields.pop('user_detail', None)
label = serializers.CharField(read_only=True)
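A pattern repeated throughout these serializer hunks is changing `self.fields.pop('x')` to `self.fields.pop('x', None)`. Supplying a default makes the removal idempotent: if another mixin (such as the new import/export mixin) has already removed the field, a bare `pop()` would raise `KeyError`. Demonstrated on a plain dict:

```python
# fields behaves like a dict of serializer fields.
fields = {'part_detail': object(), 'location_detail': object()}

fields.pop('part_detail', None)   # removes the field
fields.pop('part_detail', None)   # safe no-op if already removed
```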

View File

@ -765,11 +765,11 @@ class StockItemListTest(StockAPITestCase):
# Expected headers
headers = [
'Part ID',
'Customer ID',
'Location ID',
'Part',
'Customer',
'Stock Location',
'Location Name',
'Parent ID',
'Parent Item',
'Quantity',
'Status',
]

View File

@ -2449,6 +2449,7 @@ function loadBuildLineTable(table, build_id, options={}) {
// If data is passed directly to this function, do not setup filters
if (!options.data) {
setupFilterList('buildlines', $(table), filterTarget, {
download: true,
labels: {
modeltype: 'buildline',
},

View File

@ -357,6 +357,10 @@ class RuleSet(models.Model):
'django_q_task',
'django_q_schedule',
'django_q_success',
# Importing
'importer_dataimportsession',
'importer_dataimportcolumnmap',
'importer_dataimportrow',
]
RULESET_CHANGE_INHERIT = [('part', 'partparameter'), ('part', 'bomitem')]
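The new RuleSet entries follow Django's default database table naming: the app label joined to the lowercased model name with an underscore. A small helper capturing that convention (illustrative; Django derives this via `Model._meta.db_table`):

```python
def default_db_table(app_label: str, model_name: str) -> str:
    """Return Django's default table name for a model."""
    return f'{app_label}_{model_name.lower()}'
```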

View File

@ -43,7 +43,7 @@ export function ActionButton(props: ActionButtonProps) {
props.tooltip ?? props.text ?? ''
)}`}
onClick={props.onClick ?? notYetImplemented}
variant={props.variant ?? 'light'}
variant={props.variant ?? 'transparent'}
>
<Group gap="xs" wrap="nowrap">
{props.icon}

View File

@ -67,6 +67,7 @@ export interface ApiFormAction {
* @param successMessage : Optional message to display on successful form submission
* @param onFormSuccess : A callback function to call when the form is submitted successfully.
* @param onFormError : A callback function to call when the form is submitted with errors.
* @param processFormData : A callback function to process the form data before submission
* @param modelType : Define a model type for this form
* @param follow : Boolean, follow the result of the form (if possible)
* @param table : Table to update on success (if provided)
@ -91,6 +92,7 @@ export interface ApiFormProps {
successMessage?: string;
onFormSuccess?: (data: any) => void;
onFormError?: () => void;
processFormData?: (data: any) => any;
table?: TableState;
modelType?: ModelType;
follow?: boolean;
@ -386,6 +388,11 @@ export function ApiForm({
}
});
// Optionally pre-process the data before submitting it
if (props.processFormData) {
data = props.processFormData(data);
}
return api({
method: method,
url: url,

View File

@ -5,10 +5,12 @@ import { ApiFormField, ApiFormFieldType } from './fields/ApiFormField';
export function StandaloneField({
fieldDefinition,
defaultValue
defaultValue,
hideLabels
}: {
fieldDefinition: ApiFormFieldType;
defaultValue?: any;
hideLabels?: boolean;
}) {
const defaultValues = useMemo(() => {
if (defaultValue)
@ -29,6 +31,7 @@ export function StandaloneField({
fieldName="field"
definition={fieldDefinition}
control={form.control}
hideLabels={hideLabels}
/>
</FormProvider>
);

View File

@ -102,11 +102,13 @@ export type ApiFormFieldType = {
export function ApiFormField({
fieldName,
definition,
control
control,
hideLabels
}: {
fieldName: string;
definition: ApiFormFieldType;
control: Control<FieldValues, any>;
hideLabels?: boolean;
}) {
const fieldId = useId();
const controller = useController({
@ -128,18 +130,26 @@ export function ApiFormField({
}
}, [definition.value]);
const fieldDefinition: ApiFormFieldType = useMemo(() => {
return {
...definition,
label: hideLabels ? undefined : definition.label,
description: hideLabels ? undefined : definition.description
};
}, [definition]);
// pull out onValueChange as this can cause strange errors when passing the
// definition to the input components via spread syntax
const reducedDefinition = useMemo(() => {
return {
...definition,
...fieldDefinition,
onValueChange: undefined,
adjustFilters: undefined,
adjustValue: undefined,
read_only: undefined,
children: undefined
};
}, [definition]);
}, [fieldDefinition]);
// Callback helper when form value changes
const onChange = useCallback(
@ -193,7 +203,7 @@ export function ApiFormField({
return (
<RelatedModelField
controller={controller}
definition={definition}
definition={fieldDefinition}
fieldName={fieldName}
/>
);
@ -228,14 +238,16 @@ export function ApiFormField({
aria-label={`boolean-field-${field.name}`}
radius="lg"
size="sm"
checked={isTrue(value)}
checked={isTrue(reducedDefinition.value)}
error={error?.message}
onChange={(event) => onChange(event.currentTarget.checked)}
/>
);
case 'date':
case 'datetime':
return <DateField controller={controller} definition={definition} />;
return (
<DateField controller={controller} definition={fieldDefinition} />
);
case 'integer':
case 'decimal':
case 'float':
@ -259,7 +271,7 @@ export function ApiFormField({
<ChoiceField
controller={controller}
fieldName={fieldName}
definition={definition}
definition={fieldDefinition}
/>
);
case 'file upload':
@ -277,7 +289,7 @@ export function ApiFormField({
case 'nested object':
return (
<NestedObjectField
definition={definition}
definition={fieldDefinition}
fieldName={fieldName}
control={control}
/>
@ -285,7 +297,7 @@ export function ApiFormField({
case 'table':
return (
<TableField
definition={definition}
definition={fieldDefinition}
fieldName={fieldName}
control={controller}
/>
@@ -293,8 +305,8 @@
default:
return (
<Alert color="red" title={t`Error`}>
Invalid field type for field '{fieldName}': '{definition.field_type}
'
Invalid field type for field '{fieldName}': '
{fieldDefinition.field_type}'
</Alert>
);
}
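The `reducedDefinition` memo above exists to strip callback props (`onValueChange`, `adjustFilters`, etc.) before the field definition is spread onto the underlying input components, since spreading function-valued props through can cause the "strange errors" the comment mentions. A standalone sketch of that omit pattern (the helper name is ours, not from the PR):

```typescript
// Mirror of the `reducedDefinition` memo: spread the field definition,
// but blank out props that must not reach the input component.
type FieldDefinition = Record<string, any>;

function reduceDefinition(definition: FieldDefinition): FieldDefinition {
  return {
    ...definition,
    onValueChange: undefined,
    adjustFilters: undefined,
    adjustValue: undefined,
    read_only: undefined,
    children: undefined
  };
}
```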


@@ -65,6 +65,7 @@ export function ChoiceField({
disabled={definition.disabled}
leftSection={definition.icon}
comboboxProps={{ withinPortal: true }}
searchable
/>
);
}


@@ -0,0 +1,397 @@
import { t } from '@lingui/macro';
import { Group, HoverCard, Stack, Text } from '@mantine/core';
import { notifications } from '@mantine/notifications';
import {
IconArrowRight,
IconCircleCheck,
IconCircleDashedCheck,
IconExclamationCircle
} from '@tabler/icons-react';
import { ReactNode, useCallback, useMemo, useState } from 'react';
import { api } from '../../App';
import { ApiEndpoints } from '../../enums/ApiEndpoints';
import { cancelEvent } from '../../functions/events';
import {
useDeleteApiFormModal,
useEditApiFormModal
} from '../../hooks/UseForm';
import { ImportSessionState } from '../../hooks/UseImportSession';
import { useTable } from '../../hooks/UseTable';
import { apiUrl } from '../../states/ApiState';
import { TableColumn } from '../../tables/Column';
import { TableFilter } from '../../tables/Filter';
import { InvenTreeTable } from '../../tables/InvenTreeTable';
import { RowDeleteAction, RowEditAction } from '../../tables/RowActions';
import { ActionButton } from '../buttons/ActionButton';
import { YesNoButton } from '../buttons/YesNoButton';
import { ApiFormFieldSet } from '../forms/fields/ApiFormField';
import { RenderRemoteInstance } from '../render/Instance';
function ImporterDataCell({
session,
column,
row,
onEdit
}: {
session: ImportSessionState;
column: any;
row: any;
onEdit?: () => void;
}) {
const onRowEdit = useCallback(
(event: any) => {
cancelEvent(event);
if (!row.complete) {
onEdit?.();
}
},
[onEdit, row]
);
const cellErrors: string[] = useMemo(() => {
if (!row.errors) {
return [];
}
return row?.errors[column.field] ?? [];
}, [row.errors, column.field]);
const cellValue: ReactNode = useMemo(() => {
let field_def = session.availableFields[column.field];
if (!row?.data) {
return '-';
}
switch (field_def?.type) {
case 'boolean':
return (
<YesNoButton value={row.data ? row.data[column.field] : false} />
);
case 'related field':
if (field_def.model && row.data[column.field]) {
return (
<RenderRemoteInstance
model={field_def.model}
pk={row.data[column.field]}
/>
);
}
break;
default:
break;
}
let value = row.data ? row.data[column.field] ?? '' : '';
if (!value) {
value = '-';
}
return value;
}, [row.data, column.field, session.availableFields]);
const cellValid: boolean = useMemo(
() => cellErrors.length == 0,
[cellErrors]
);
return (
<HoverCard disabled={cellValid} openDelay={100} closeDelay={100}>
<HoverCard.Target>
<Group grow justify="space-between" onClick={onRowEdit}>
<Group grow style={{ flex: 1 }}>
<Text size="xs" c={cellValid ? undefined : 'red'}>
{cellValue}
</Text>
</Group>
</Group>
</HoverCard.Target>
<HoverCard.Dropdown>
<Stack gap="xs">
{cellErrors.map((error: string) => (
<Text size="xs" c="red" key={error}>
{error}
</Text>
))}
</Stack>
</HoverCard.Dropdown>
</HoverCard>
);
}
export default function ImporterDataSelector({
session
}: {
session: ImportSessionState;
}) {
const table = useTable('dataimporter');
const [selectedFieldNames, setSelectedFieldNames] = useState<string[]>([]);
const selectedFields: ApiFormFieldSet = useMemo(() => {
let fields: ApiFormFieldSet = {};
for (let field of selectedFieldNames) {
// Find the field definition in session.availableFields
let fieldDef = session.availableFields[field];
if (fieldDef) {
fields[field] = {
...fieldDef,
field_type: fieldDef.type,
description: fieldDef.help_text
};
}
}
return fields;
}, [selectedFieldNames, session.availableFields]);
const importData = useCallback(
(rows: number[]) => {
notifications.show({
title: t`Importing Rows`,
message: t`Please wait while the data is imported`,
autoClose: false,
color: 'blue',
id: 'importing-rows',
icon: <IconArrowRight />
});
api
.post(
apiUrl(ApiEndpoints.import_session_accept_rows, session.sessionId),
{
rows: rows
}
)
.catch(() => {
notifications.show({
title: t`Error`,
message: t`An error occurred while importing data`,
color: 'red',
autoClose: true
});
})
.finally(() => {
table.clearSelectedRecords();
notifications.hide('importing-rows');
table.refreshTable();
});
},
[session.sessionId, table.refreshTable]
);
const [selectedRow, setSelectedRow] = useState<any>({});
const editRow = useEditApiFormModal({
url: ApiEndpoints.import_session_row_list,
pk: selectedRow.pk,
title: t`Edit Data`,
fields: selectedFields,
initialData: selectedRow.data,
processFormData: (data: any) => {
// Construct fields back into a single object
return {
data: {
...selectedRow.data,
...data
}
};
},
onFormSuccess: (row: any) => table.updateRecord(row)
});
const editCell = useCallback(
(row: any, col: any) => {
setSelectedRow(row);
setSelectedFieldNames([col.field]);
editRow.open();
},
[session, editRow]
);
const deleteRow = useDeleteApiFormModal({
url: ApiEndpoints.import_session_row_list,
pk: selectedRow.pk,
title: t`Delete Row`,
onFormSuccess: () => table.refreshTable()
});
const rowErrors = useCallback((row: any) => {
if (!row.errors) {
return [];
}
let errors: string[] = [];
for (const k of Object.keys(row.errors)) {
if (row.errors[k]) {
if (Array.isArray(row.errors[k])) {
row.errors[k].forEach((e: string) => {
errors.push(`${k}: ${e}`);
});
} else {
errors.push(row.errors[k].toString());
}
}
}
return errors;
}, []);
const columns: TableColumn[] = useMemo(() => {
let columns: TableColumn[] = [
{
accessor: 'row_index',
title: t`Row`,
sortable: true,
switchable: false,
render: (row: any) => {
return (
<Group justify="left" gap="xs">
<Text size="sm">{row.row_index}</Text>
{row.complete && <IconCircleCheck color="green" size={16} />}
{!row.complete && row.valid && (
<IconCircleDashedCheck color="blue" size={16} />
)}
{!row.complete && !row.valid && (
<HoverCard openDelay={50} closeDelay={100}>
<HoverCard.Target>
<IconExclamationCircle color="red" size={16} />
</HoverCard.Target>
<HoverCard.Dropdown>
<Stack gap="xs">
<Text>{t`Row contains errors`}:</Text>
{rowErrors(row).map((error: string) => (
<Text size="sm" c="red" key={error}>
{error}
</Text>
))}
</Stack>
</HoverCard.Dropdown>
</HoverCard>
)}
</Group>
);
}
},
...session.mappedFields.map((column: any) => {
return {
accessor: column.field,
title: column.column ?? column.title,
sortable: false,
switchable: true,
render: (row: any) => {
return (
<ImporterDataCell
session={session}
column={column}
row={row}
onEdit={() => editCell(row, column)}
/>
);
}
};
})
];
return columns;
}, [session]);
const rowActions = useCallback(
(record: any) => {
return [
{
title: t`Accept`,
icon: <IconArrowRight />,
color: 'green',
hidden: record.complete || !record.valid,
onClick: () => {
importData([record.pk]);
}
},
RowEditAction({
hidden: record.complete,
onClick: () => {
setSelectedRow(record);
setSelectedFieldNames(
session.mappedFields.map((f: any) => f.field)
);
editRow.open();
}
}),
RowDeleteAction({
onClick: () => {
setSelectedRow(record);
deleteRow.open();
}
})
];
},
[session, importData]
);
const filters: TableFilter[] = useMemo(() => {
return [
{
name: 'valid',
label: t`Valid`,
description: t`Filter by row validation status`,
type: 'boolean'
},
{
name: 'complete',
label: t`Complete`,
description: t`Filter by row completion status`,
type: 'boolean'
}
];
}, []);
const tableActions = useMemo(() => {
// Can only "import" valid (and incomplete) rows
const canImport: boolean =
table.hasSelectedRecords &&
table.selectedRecords.every((row: any) => row.valid && !row.complete);
return [
<ActionButton
disabled={!canImport}
icon={<IconArrowRight />}
color="green"
tooltip={t`Import selected rows`}
onClick={() => {
importData(table.selectedRecords.map((row: any) => row.pk));
}}
/>
];
}, [table.hasSelectedRecords, table.selectedRecords]);
return (
<>
{editRow.modal}
{deleteRow.modal}
<Stack gap="xs">
<InvenTreeTable
tableState={table}
columns={columns}
url={apiUrl(ApiEndpoints.import_session_row_list)}
props={{
params: {
session: session.sessionId
},
rowActions: rowActions,
tableActions: tableActions,
tableFilters: filters,
enableColumnSwitching: true,
enableColumnCaching: false,
enableSelection: true,
enableBulkDelete: true
}}
/>
</Stack>
</>
);
}
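The `rowErrors` callback above flattens the per-field error dictionary returned by the API into display strings for the hover card. Extracted as a pure helper (the function name and sample field names are illustrative):

```typescript
// Flatten a row's error dictionary ({ field: [messages] }) into
// "field: message" strings, mirroring the rowErrors callback above.
function flattenRowErrors(errors: Record<string, any> | null): string[] {
  if (!errors) return [];
  const messages: string[] = [];
  for (const field of Object.keys(errors)) {
    const value = errors[field];
    if (!value) continue;
    if (Array.isArray(value)) {
      // One message per entry, prefixed with the field name
      value.forEach((msg: string) => messages.push(`${field}: ${msg}`));
    } else {
      // Non-array errors are stringified as-is
      messages.push(value.toString());
    }
  }
  return messages;
}
```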


@@ -0,0 +1,144 @@
import { t } from '@lingui/macro';
import {
Alert,
Button,
Divider,
Group,
Select,
SimpleGrid,
Stack,
Text
} from '@mantine/core';
import { useCallback, useEffect, useMemo, useState } from 'react';
import { api } from '../../App';
import { ApiEndpoints } from '../../enums/ApiEndpoints';
import { ImportSessionState } from '../../hooks/UseImportSession';
import { apiUrl } from '../../states/ApiState';
function ImporterColumn({ column, options }: { column: any; options: any[] }) {
const [errorMessage, setErrorMessage] = useState<string>('');
const [selectedColumn, setSelectedColumn] = useState<string>(
column.column ?? ''
);
useEffect(() => {
setSelectedColumn(column.column ?? '');
}, [column.column]);
const onChange = useCallback(
(value: any) => {
api
.patch(
apiUrl(ApiEndpoints.import_session_column_mapping_list, column.pk),
{
column: value || ''
}
)
.then((response) => {
setSelectedColumn(response.data?.column ?? value);
setErrorMessage('');
})
.catch((error) => {
const data = error.response.data;
setErrorMessage(
data.column ?? data.non_field_errors ?? t`An error occurred`
);
});
},
[column]
);
return (
<Select
error={errorMessage}
clearable
placeholder={t`Select column, or leave blank to ignore this field.`}
label={undefined}
data={options}
value={selectedColumn}
onChange={onChange}
/>
);
}
export default function ImporterColumnSelector({
session
}: {
session: ImportSessionState;
}) {
const [errorMessage, setErrorMessage] = useState<string>('');
const acceptMapping = useCallback(() => {
const url = apiUrl(
ApiEndpoints.import_session_accept_fields,
session.sessionId
);
api
.post(url)
.then(() => {
session.refreshSession();
})
.catch((error) => {
setErrorMessage(error.response?.data?.error ?? t`An error occurred`);
});
}, [session.sessionId]);
const columnOptions: any[] = useMemo(() => {
return [
{ value: '', label: t`Select a column from the data file` },
...session.availableColumns.map((column: any) => {
return {
value: column,
label: column
};
})
];
}, [session.availableColumns]);
return (
<Stack gap="xs">
<Group justify="space-between">
<Text>{t`Map data columns to database fields`}</Text>
<Button
color="green"
variant="filled"
onClick={acceptMapping}
>{t`Accept Column Mapping`}</Button>
</Group>
{errorMessage && (
<Alert color="red" title={t`Error`}>
<Text>{errorMessage}</Text>
</Alert>
)}
<SimpleGrid cols={3} spacing="xs">
<Text fw={700}>{t`Database Field`}</Text>
<Text fw={700}>{t`Field Description`}</Text>
<Text fw={700}>{t`Imported Column Name`}</Text>
<Divider />
<Divider />
<Divider />
{session.columnMappings.map((column: any) => {
return [
<Group gap="xs">
<Text fw={column.required ? 700 : undefined}>
{column.label ?? column.field}
</Text>
{column.required && (
<Text c="red" fw={700}>
*
</Text>
)}
</Group>,
<Text size="sm" fs="italic">
{column.description}
</Text>,
<ImporterColumn column={column} options={columnOptions} />
];
})}
</SimpleGrid>
</Stack>
);
}
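The `columnOptions` memo above prepends a blank entry so a database field can be left unmapped (ignored). The same construction as a self-contained helper (name and label text are ours):

```typescript
// Build the <Select> options from the data file's column names,
// with a leading blank entry meaning "ignore this field".
function buildColumnOptions(
  columns: string[]
): { value: string; label: string }[] {
  return [
    { value: '', label: 'Select a column from the data file' },
    ...columns.map((column) => ({ value: column, label: column }))
  ];
}
```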


@@ -0,0 +1,133 @@
import { t } from '@lingui/macro';
import {
ActionIcon,
Divider,
Drawer,
Group,
LoadingOverlay,
Paper,
Stack,
Stepper,
Text,
Tooltip
} from '@mantine/core';
import { IconCircleX } from '@tabler/icons-react';
import { ReactNode, useCallback, useMemo, useState } from 'react';
import { ModelType } from '../../enums/ModelType';
import {
ImportSessionStatus,
useImportSession
} from '../../hooks/UseImportSession';
import { StylishText } from '../items/StylishText';
import { StatusRenderer } from '../render/StatusRenderer';
import ImporterDataSelector from './ImportDataSelector';
import ImporterColumnSelector from './ImporterColumnSelector';
import ImporterImportProgress from './ImporterImportProgress';
/*
* Stepper component showing the current step of the data import process.
*/
function ImportDrawerStepper({ currentStep }: { currentStep: number }) {
/* TODO: Enhance this with:
* - Custom icons
* - Loading indicators for "background" states
*/
return (
<Stepper
active={currentStep}
onStepClick={undefined}
allowNextStepsSelect={false}
size="xs"
>
<Stepper.Step label={t`Import Data`} />
<Stepper.Step label={t`Map Columns`} />
<Stepper.Step label={t`Process Data`} />
<Stepper.Step label={t`Complete Import`} />
</Stepper>
);
}
export default function ImporterDrawer({
sessionId,
opened,
onClose
}: {
sessionId: number;
opened: boolean;
onClose: () => void;
}) {
const session = useImportSession({ sessionId: sessionId });
const widget = useMemo(() => {
switch (session.status) {
case ImportSessionStatus.INITIAL:
return <Text>Initial : TODO</Text>;
case ImportSessionStatus.MAPPING:
return <ImporterColumnSelector session={session} />;
case ImportSessionStatus.IMPORTING:
return <ImporterImportProgress session={session} />;
case ImportSessionStatus.PROCESSING:
return <ImporterDataSelector session={session} />;
case ImportSessionStatus.COMPLETE:
return <Text>Complete!</Text>;
default:
return <Text>Unknown status code: {session?.status}</Text>;
}
}, [session.status]);
const title: ReactNode = useMemo(() => {
return (
<Stack gap="xs" style={{ width: '100%' }}>
<Group
gap="xs"
wrap="nowrap"
justify="space-between"
grow
preventGrowOverflow={false}
>
<StylishText>
{session.sessionData?.statusText ?? t`Importing Data`}
</StylishText>
{StatusRenderer({
status: session.status,
type: ModelType.importsession
})}
<Tooltip label={t`Cancel import session`}>
<ActionIcon color="red" variant="transparent" onClick={onClose}>
<IconCircleX />
</ActionIcon>
</Tooltip>
</Group>
<Divider />
</Stack>
);
}, [session.sessionData]);
return (
<Drawer
position="bottom"
size="80%"
title={title}
opened={opened}
onClose={onClose}
withCloseButton={false}
closeOnEscape={false}
closeOnClickOutside={false}
styles={{
header: {
width: '100%'
},
title: {
width: '100%'
}
}}
>
<Stack gap="xs">
<LoadingOverlay visible={session.sessionQuery.isFetching} />
<Paper p="md">{session.sessionQuery.isFetching || widget}</Paper>
</Stack>
</Drawer>
);
}
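`ImportDrawerStepper` takes a numeric `currentStep`, so the drawer needs some mapping from session status to one of the four steps. The PR does not show that mapping, so the following is an assumption about how it could be derived (status values copied from the `UseImportSession` hook later in this diff):

```typescript
// Status codes as defined in the UseImportSession hook.
enum ImportSessionStatus {
  INITIAL = 0,
  MAPPING = 10,
  IMPORTING = 20,
  PROCESSING = 30,
  COMPLETE = 40
}

// Hypothetical mapping from session status to the four Stepper steps
// (Import Data, Map Columns, Process Data, Complete Import).
function statusToStep(status: ImportSessionStatus): number {
  switch (status) {
    case ImportSessionStatus.INITIAL:
      return 0;
    case ImportSessionStatus.MAPPING:
      return 1;
    case ImportSessionStatus.IMPORTING:
    case ImportSessionStatus.PROCESSING:
      // Both background import and row review happen in "Process Data"
      return 2;
    case ImportSessionStatus.COMPLETE:
      return 3;
    default:
      return 0;
  }
}
```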


@@ -0,0 +1,46 @@
import { t } from '@lingui/macro';
import { Center, Container, Loader, Stack, Text } from '@mantine/core';
import { useInterval } from '@mantine/hooks';
import { useEffect } from 'react';
import {
ImportSessionState,
ImportSessionStatus
} from '../../hooks/UseImportSession';
import { StylishText } from '../items/StylishText';
export default function ImporterImportProgress({
session
}: {
session: ImportSessionState;
}) {
// Periodically refresh the import session data
const interval = useInterval(() => {
console.log('refreshing:', session.status);
if (session.status == ImportSessionStatus.IMPORTING) {
session.refreshSession();
}
}, 1000);
useEffect(() => {
interval.start();
return interval.stop;
}, []);
return (
<>
<Center>
<Container>
<Stack gap="xs">
<StylishText size="lg">{t`Importing Records`}</StylishText>
<Loader />
<Text size="lg">
{t`Imported rows`}: {session.sessionData.row_count}
</Text>
</Stack>
</Container>
</Center>
</>
);
}
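The component above re-fetches the session every second for as long as it stays in the `IMPORTING` state. The same stop condition, extracted into a plain async poller (this helper and its parameters are illustrative, not part of the PR):

```typescript
// Status code for "importing", per the UseImportSession hook.
const IMPORTING = 20;

// Repeatedly call fetchStatus until the session leaves the IMPORTING
// state, then resolve with the final status code.
async function pollWhileImporting(
  fetchStatus: () => Promise<number>,
  intervalMs: number = 1000
): Promise<number> {
  for (;;) {
    const status = await fetchStatus();
    if (status !== IMPORTING) {
      return status;
    }
    // Wait before the next poll
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```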


@@ -66,7 +66,7 @@ export function ActionDropdown({
<ActionIcon
size="lg"
radius="sm"
variant="outline"
variant="transparent"
disabled={disabled}
aria-label={menuName}
>


@@ -21,6 +21,7 @@ import { Link, useNavigate } from 'react-router-dom';
import { api } from '../../App';
import { ApiEndpoints } from '../../enums/ApiEndpoints';
import { apiUrl } from '../../states/ApiState';
import { useUserState } from '../../states/UserState';
import { StylishText } from '../items/StylishText';
/**
@@ -33,10 +34,12 @@ export function NotificationDrawer({
opened: boolean;
onClose: () => void;
}) {
const { isLoggedIn } = useUserState();
const navigate = useNavigate();
const notificationQuery = useQuery({
enabled: opened,
enabled: opened && isLoggedIn(),
queryKey: ['notifications', opened],
queryFn: async () =>
api


@@ -14,3 +14,11 @@ export function RenderProjectCode({
)
);
}
export function RenderImportSession({
instance
}: {
instance: any;
}): ReactNode {
return instance && <RenderInlineModel primary={instance.data_file} />;
}


@@ -1,9 +1,12 @@
import { t } from '@lingui/macro';
import { Alert, Anchor, Group, Space, Text } from '@mantine/core';
import { Alert, Anchor, Group, Skeleton, Space, Text } from '@mantine/core';
import { useQuery, useSuspenseQuery } from '@tanstack/react-query';
import { ReactNode, useCallback } from 'react';
import { api } from '../../App';
import { ModelType } from '../../enums/ModelType';
import { navigateToLink } from '../../functions/navigation';
import { apiUrl } from '../../states/ApiState';
import { Thumbnail } from '../images/Thumbnail';
import { RenderBuildLine, RenderBuildOrder } from './Build';
import {
@@ -13,7 +16,8 @@ import {
RenderManufacturerPart,
RenderSupplierPart
} from './Company';
import { RenderProjectCode } from './Generic';
import { RenderImportSession, RenderProjectCode } from './Generic';
import { ModelInformationDict } from './ModelType';
import {
RenderPurchaseOrder,
RenderReturnOrder,
@@ -75,6 +79,7 @@ const RendererLookup: EnumDictionary<
[ModelType.stockhistory]: RenderStockItem,
[ModelType.supplierpart]: RenderSupplierPart,
[ModelType.user]: RenderUser,
[ModelType.importsession]: RenderImportSession,
[ModelType.reporttemplate]: RenderReportTemplate,
[ModelType.labeltemplate]: RenderLabelTemplate,
[ModelType.pluginconfig]: RenderPlugin
@@ -103,6 +108,36 @@ export function RenderInstance(props: RenderInstanceProps): ReactNode {
return <RenderComponent {...props} />;
}
export function RenderRemoteInstance({
model,
pk
}: {
model: ModelType;
pk: number;
}): ReactNode {
const { data, isLoading, isFetching } = useQuery({
queryKey: ['model', model, pk],
queryFn: async () => {
const url = apiUrl(ModelInformationDict[model].api_endpoint, pk);
return api
.get(url)
.then((response) => response.data)
.catch(() => null);
}
});
if (isLoading || isFetching) {
return <Skeleton />;
}
if (!data) {
return <Text>{pk}</Text>;
}
return <RenderInstance model={model} instance={data} />;
}
/**
* Helper function for rendering an inline model in a consistent style
*/


@@ -196,6 +196,13 @@ export const ModelInformationDict: ModelDict = {
url_detail: '/user/:pk/',
api_endpoint: ApiEndpoints.user_list
},
importsession: {
label: t`Import Session`,
label_multiple: t`Import Sessions`,
url_overview: '/import',
url_detail: '/import/:pk/',
api_endpoint: ApiEndpoints.import_session_list
},
labeltemplate: {
label: t`Label Template`,
label_multiple: t`Label Templates`,


@@ -7,6 +7,7 @@ import { useGlobalStatusState } from '../../states/StatusState';
interface StatusCodeInterface {
key: string;
label: string;
name: string;
color: string;
}
@@ -41,7 +42,9 @@
}
if (!text) {
console.error(`renderStatusLabel could not find match for code ${key}`);
console.error(
`ERR: renderStatusLabel could not find match for code ${key}`
);
}
// Fallbacks
@@ -59,6 +62,49 @@
</Badge>
);
}
export function getStatusCodes(type: ModelType | string) {
const statusCodeList = useGlobalStatusState.getState().status;
if (statusCodeList === undefined) {
console.log('StatusRenderer: statusCodeList is undefined');
return null;
}
const statusCodes = statusCodeList[type];
if (statusCodes === undefined) {
console.log('StatusRenderer: statusCodes is undefined');
return null;
}
return statusCodes;
}
/*
* Return the name of a status code, based on the key
*/
export function getStatusCodeName(
type: ModelType | string,
key: string | number
) {
const statusCodes = getStatusCodes(type);
if (!statusCodes) {
return null;
}
for (let name in statusCodes) {
let entry = statusCodes[name];
if (entry.key == key) {
return entry.name;
}
}
return null;
}
/*
* Render the status for a object.
* Uses the values specified in "status_codes.py"
@@ -72,14 +118,9 @@ export const StatusRenderer = ({
type: ModelType | string;
options?: RenderStatusLabelOptionsInterface;
}) => {
const statusCodeList = useGlobalStatusState.getState().status;
const statusCodes = getStatusCodes(type);
if (status === undefined || statusCodeList === undefined) {
return null;
}
const statusCodes = statusCodeList[type];
if (statusCodes === undefined) {
if (statusCodes === undefined || statusCodes === null) {
console.warn('StatusRenderer: statusCodes is undefined');
return null;
}
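The new `getStatusCodeName` helper above scans the status-code dictionary for an entry whose `key` matches. Its core lookup, detached from the global state store for clarity (the standalone function name is ours):

```typescript
// Shape of a status code entry, per the StatusCodeInterface above.
interface StatusCodeInterface {
  key: string;
  label: string;
  name: string;
  color: string;
}

// Find the machine-readable name for a status key, mirroring
// getStatusCodeName minus the global-state lookup. Loose equality (==)
// is kept deliberately, as the key may arrive as a string or a number.
function lookupStatusName(
  statusCodes: Record<string, StatusCodeInterface>,
  key: string | number
): string | null {
  for (const name in statusCodes) {
    if (statusCodes[name].key == key) {
      return statusCodes[name].name;
    }
  }
  return null;
}
```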

View File

@@ -13,7 +13,8 @@ export const statusCodeList: Record<string, ModelType> = {
ReturnOrderStatus: ModelType.returnorder,
SalesOrderStatus: ModelType.salesorder,
StockHistoryCode: ModelType.stockhistory,
StockStatus: ModelType.stockitem
StockStatus: ModelType.stockitem,
DataImportStatusCode: ModelType.importsession
};
/*


@@ -46,6 +46,13 @@ export enum ApiEndpoints {
group_list = 'user/group/',
owner_list = 'user/owner/',
// Data import endpoints
import_session_list = 'importer/session/',
import_session_accept_fields = 'importer/session/:id/accept_fields/',
import_session_accept_rows = 'importer/session/:id/accept_rows/',
import_session_column_mapping_list = 'importer/column-mapping/',
import_session_row_list = 'importer/row/',
// Notification endpoints
notifications_list = 'notifications/',
notifications_readall = 'notifications/readall/',
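The new importer endpoints embed an `:id` placeholder which the existing `apiUrl` helper (from `states/ApiState`) fills in with a primary key, as seen in calls like `apiUrl(ApiEndpoints.import_session_accept_rows, session.sessionId)`. A minimal stand-in for that substitution, assuming simple string replacement:

```typescript
// Hypothetical stand-in for apiUrl(): substitute the ':id' placeholder
// in an endpoint pattern with a concrete primary key, if one is given.
function buildUrl(endpoint: string, pk?: number | string): string {
  return pk !== undefined ? endpoint.replace(':id', String(pk)) : endpoint;
}
```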


@@ -21,6 +21,7 @@ export enum ModelType {
salesorder = 'salesorder',
salesordershipment = 'salesordershipment',
returnorder = 'returnorder',
importsession = 'importsession',
address = 'address',
contact = 'contact',
owner = 'owner',


@@ -0,0 +1,11 @@
import { ApiFormFieldSet } from '../components/forms/fields/ApiFormField';
export function dataImporterSessionFields(): ApiFormFieldSet {
return {
data_file: {},
model_type: {},
field_defaults: {
hidden: true
}
};
}


@@ -0,0 +1,107 @@
import { useCallback, useMemo } from 'react';
import { api } from '../App';
import { ApiEndpoints } from '../enums/ApiEndpoints';
import { apiUrl } from '../states/ApiState';
import { useInstance } from './UseInstance';
/*
* Custom hook for managing the state of a data import session
*/
// TODO: Load these values from the server?
export enum ImportSessionStatus {
INITIAL = 0,
MAPPING = 10,
IMPORTING = 20,
PROCESSING = 30,
COMPLETE = 40
}
export type ImportSessionState = {
sessionId: number;
sessionData: any;
refreshSession: () => void;
sessionQuery: any;
status: ImportSessionStatus;
availableFields: Record<string, any>;
availableColumns: string[];
mappedFields: any[];
columnMappings: any[];
};
export function useImportSession({
sessionId
}: {
sessionId: number;
}): ImportSessionState {
// Query manager for the import session
const {
instance: sessionData,
refreshInstance: refreshSession,
instanceQuery: sessionQuery
} = useInstance({
endpoint: ApiEndpoints.import_session_list,
pk: sessionId,
defaultValue: {}
});
// Current step of the import process
const status: ImportSessionStatus = useMemo(() => {
return sessionData?.status ?? ImportSessionStatus.INITIAL;
}, [sessionData]);
// List of available writeable database field definitions
const availableFields: any[] = useMemo(() => {
return sessionData?.available_fields ?? [];
}, [sessionData]);
// List of available data file columns
const availableColumns: string[] = useMemo(() => {
let cols = sessionData?.columns ?? [];
// Filter out any blank or duplicate columns
cols = cols.filter((col: string) => !!col);
cols = cols.filter(
(col: string, index: number) => cols.indexOf(col) === index
);
return cols;
}, [sessionData.columns]);
const columnMappings: any[] = useMemo(() => {
let mapping =
sessionData?.column_mappings?.map((mapping: any) => ({
...mapping,
...(availableFields[mapping.field] ?? {})
})) ?? [];
mapping = mapping.sort((a: any, b: any) => {
if (a?.required && !b?.required) return -1;
if (!a?.required && b?.required) return 1;
return 0;
});
return mapping;
}, [sessionData, availableColumns]);
// List of field which have been mapped to columns
const mappedFields: any[] = useMemo(() => {
return (
sessionData?.column_mappings?.filter((column: any) => !!column.column) ??
[]
);
}, [sessionData]);
return {
sessionData,
sessionId,
refreshSession,
sessionQuery,
status,
availableFields,
availableColumns,
columnMappings,
mappedFields
};
}
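Two pieces of logic in this hook are easy to verify in isolation: the blank/duplicate filter in the `availableColumns` memo, and the required-first sort in the `columnMappings` memo. Extracted as pure helpers (the function names are ours):

```typescript
// Drop blank and duplicate column names, preserving first-seen order
// (mirrors the availableColumns memo above).
function cleanColumns(columns: string[]): string[] {
  return columns
    .filter((col) => !!col)
    .filter((col, index, arr) => arr.indexOf(col) === index);
}

// Sort column mappings so required fields come first, leaving the
// relative order of the rest unchanged (mirrors the columnMappings sort).
function sortMappings<T extends { required?: boolean }>(mappings: T[]): T[] {
  return [...mappings].sort((a, b) => {
    if (a.required && !b.required) return -1;
    if (!a.required && b.required) return 1;
    return 0;
  });
}
```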

Some files were not shown because too many files have changed in this diff.