DPDK patches and discussions
* [PATCH 0/5] dts: Pydantic configuration
@ 2024-08-22 16:39 Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                   ` (4 more replies)
  0 siblings, 5 replies; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev; +Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro

Hello,

sending the first version for the Pydantic configuration update work.

Best,
Luca

Luca Vizzarro (5):
  dts: add TestSuiteSpec class and discovery
  dts: add Pydantic and remove Warlock
  dts: use Pydantic in the configuration
  dts: use TestSuiteSpec class imports
  dts: add JSON schema generation script

 doc/guides/tools/dts.rst                      |  10 +
 dts/framework/config/__init__.py              | 588 +++++++------
 dts/framework/config/conf_yaml_schema.json    | 776 ++++++++++--------
 dts/framework/config/types.py                 | 132 ---
 dts/framework/runner.py                       | 198 ++---
 dts/framework/settings.py                     |  16 +-
 dts/framework/test_suite.py                   | 182 +++-
 dts/framework/testbed_model/sut_node.py       |   2 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/generate-schema.py                        |  38 +
 dts/poetry.lock                               | 346 +++-----
 dts/pyproject.toml                            |   3 +-
 13 files changed, 1152 insertions(+), 1145 deletions(-)
 delete mode 100644 dts/framework/config/types.py
 create mode 100755 dts/generate-schema.py

-- 
2.34.1



* [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-16 13:00   ` Juraj Linkeš
  2024-09-19 20:01   ` Nicholas Pratte
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
                   ` (3 subsequent siblings)
  4 siblings, 2 replies; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Currently there is no definition that identifies all the test suites
available to run. This change simplifies the process of discovering and
identifying all the test suites.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
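Note: the discovery described above boils down to scanning a package for
modules whose names carry a known prefix. A minimal, dependency-free sketch
follows; the helper name `find_suite_modules` is illustrative and not part
of this patch, which wraps each discovered module in a TestSuiteSpec.

```python
from importlib import import_module
from pkgutil import iter_modules


def find_suite_modules(package_name: str, prefix: str = "TestSuite_") -> list[str]:
    """Return module names under `package_name` that look like test suites.

    Packages are skipped; only plain modules starting with `prefix` qualify.
    """
    package = import_module(package_name)
    return [
        module_name
        for _, module_name, is_pkg in iter_modules(package.__path__)
        if module_name.startswith(prefix) and not is_pkg
    ]
```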
 dts/framework/test_suite.py | 182 +++++++++++++++++++++++++++++++++++-
 1 file changed, 181 insertions(+), 1 deletion(-)

diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 694b2eba65..972968b036 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -13,12 +14,22 @@
     * Test case verification.
 """
 
+import inspect
+import re
+from dataclasses import dataclass
+from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
-from typing import ClassVar, Union
+from pkgutil import iter_modules
+from types import FunctionType, ModuleType
+from typing import ClassVar, NamedTuple, Union
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.port import Port, PortLink
 from framework.testbed_model.sut_node import SutNode
@@ -365,3 +376,172 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
         if received_packet.src != expected_packet.src or received_packet.dst != expected_packet.dst:
             return False
         return True
+
+
+class TestCaseVariant(Enum):
+    """Enum representing the variant of the test case."""
+
+    #:
+    FUNCTIONAL = auto()
+    #:
+    PERFORMANCE = auto()
+
+
+class TestCase(NamedTuple):
+    """Tuple representing a test case."""
+
+    #: The name of the test case without prefix
+    name: str
+    #: The reference to the function
+    function_type: FunctionType
+    #: The test case variant
+    variant: TestCaseVariant
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module_type(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_type(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_type in inspect.getmembers(self.module_type, is_test_suite):
+            if class_name == self.class_name:
+                return class_type
+
+        raise Exception("class not found in eligible test module")
+
+    @cached_property
+    def test_cases(self) -> list[TestCase]:
+        """A list of all the available test cases."""
+        test_cases = []
+
+        functions = inspect.getmembers(self.class_type, inspect.isfunction)
+        for fn_name, fn_type in functions:
+            if prefix := re.match(self.FUNC_TEST_CASE_REGEX, fn_name):
+                variant = TestCaseVariant.FUNCTIONAL
+            elif prefix := re.match(self.PERF_TEST_CASE_REGEX, fn_name):
+                variant = TestCaseVariant.PERFORMANCE
+            else:
+                continue
+
+            name = fn_name[len(prefix.group(0)) :]
+            test_cases.append(TestCase(name, fn_type, variant))
+
+        return test_cases
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with `self.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites, if none is set the
+                constant :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used instead.
+            module_prefix: The name prefix defining the test suite module, if none is set the
+                constant :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` is used instead.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_type:
+                    test_suites.append(test_suite)
+            except Exception:
+                pass
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
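The class-name derivation above (strip the module prefix, PascalCase the
remainder, prepend the class prefix) can be sketched as follows. Here
`to_pascal` is a stand-in re-implementation of
`pydantic.alias_generators.to_pascal` to keep the example dependency-free,
and `class_name_for` is illustrative rather than part of the patch:

```python
def to_pascal(snake: str) -> str:
    """Convert snake_case to PascalCase, lowercasing acronyms (OS -> Os)."""
    return "".join(word.capitalize() for word in snake.split("_"))


def class_name_for(module_name: str, module_prefix: str = "TestSuite_",
                   class_prefix: str = "Test") -> str:
    """Derive the expected test suite class name from its module name."""
    suite_name = module_name[len(module_prefix):]
    return f"{class_prefix}{to_pascal(suite_name)}"
```

For example, a module named `TestSuite_os_udp` is expected to contain a
class named `TestOsUdp`, matching the convention in the `discover_all`
docstring.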
-- 
2.34.1



* [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-16 13:17   ` Juraj Linkeš
  2024-09-19 19:56   ` Nicholas Pratte
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
                   ` (2 subsequent siblings)
  4 siblings, 2 replies; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Add Pydantic to the project dependencies while dropping Warlock.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
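Note: the practical difference is that Warlock validates plain dicts
against a JSON schema, while Pydantic validates typed models declared in
Python. A minimal sketch of the Pydantic style; the `NodeConfig` model here
is illustrative, not one introduced later in this series:

```python
from pydantic import BaseModel, ValidationError


class NodeConfig(BaseModel):
    name: str
    hostname: str
    port: int = 22  # validated as an integer, with a default


# Defaults are applied and fields are type-checked at construction time.
config = NodeConfig(name="sut1", hostname="10.0.0.1")
assert config.port == 22

# Invalid input is rejected up front instead of failing later at use time.
try:
    NodeConfig(name="sut1", hostname="10.0.0.1", port="not-a-port")
except ValidationError:
    pass
```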
 dts/poetry.lock    | 346 +++++++++++++++++----------------------------
 dts/pyproject.toml |   3 +-
 2 files changed, 135 insertions(+), 214 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..c5b0d059a8 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,23 +1,16 @@
 # This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
 
 [[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "bcrypt"
 version = "4.0.1"
@@ -280,66 +273,6 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
 plugins = ["setuptools"]
 requirements-deprecated-finder = ["pip-api", "pipreqs"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -492,6 +425,129 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.8.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.8.2-py3-none-any.whl", hash = "sha256:73ee9fddd406dc318b885c7a2eab8a6472b68b8fb5ba8150949fc3db939f23c8"},
+    {file = "pydantic-2.8.2.tar.gz", hash = "sha256:6f62c13d067b0755ad1c21a34bdd06c0c12625a22b0fc09c6b149816604f7c2a"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.4.0"
+pydantic-core = "2.20.1"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.20.1"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3acae97ffd19bf091c72df4d726d552c473f3576409b2a7ca36b2f535ffff4a3"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:41f4c96227a67a013e7de5ff8f20fb496ce573893b7f4f2707d065907bffdbd6"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f239eb799a2081495ea659d8d4a43a8f42cd1fe9ff2e7e436295c38a10c286a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53e431da3fc53360db73eedf6f7124d1076e1b4ee4276b36fb25514544ceb4a3"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1f62b2413c3a0e846c3b838b2ecd6c7a19ec6793b2a522745b0869e37ab5bc1"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d41e6daee2813ecceea8eda38062d69e280b39df793f5a942fa515b8ed67953"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d482efec8b7dc6bfaedc0f166b2ce349df0011f5d2f1f25537ced4cfc34fd98"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e93e1a4b4b33daed65d781a57a522ff153dcf748dee70b40c7258c5861e1768a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e7c4ea22b6739b162c9ecaaa41d718dfad48a244909fe7ef4b54c0b530effc5a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4f2790949cf385d985a31984907fecb3896999329103df4e4983a4a41e13e840"},
+    {file = "pydantic_core-2.20.1-cp310-none-win32.whl", hash = "sha256:5e999ba8dd90e93d57410c5e67ebb67ffcaadcea0ad973240fdfd3a135506250"},
+    {file = "pydantic_core-2.20.1-cp310-none-win_amd64.whl", hash = "sha256:512ecfbefef6dac7bc5eaaf46177b2de58cdf7acac8793fe033b24ece0b9566c"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:d2a8fa9d6d6f891f3deec72f5cc668e6f66b188ab14bb1ab52422fe8e644f312"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:175873691124f3d0da55aeea1d90660a6ea7a3cfea137c38afa0a5ffabe37b88"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:37eee5b638f0e0dcd18d21f59b679686bbd18917b87db0193ae36f9c23c355fc"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:25e9185e2d06c16ee438ed39bf62935ec436474a6ac4f9358524220f1b236e43"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:150906b40ff188a3260cbee25380e7494ee85048584998c1e66df0c7a11c17a6"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ad4aeb3e9a97286573c03df758fc7627aecdd02f1da04516a86dc159bf70121"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d3f3ed29cd9f978c604708511a1f9c2fdcb6c38b9aae36a51905b8811ee5cbf1"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0dae11d8f5ded51699c74d9548dcc5938e0804cc8298ec0aa0da95c21fff57b"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:faa6b09ee09433b87992fb5a2859efd1c264ddc37280d2dd5db502126d0e7f27"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9dc1b507c12eb0481d071f3c1808f0529ad41dc415d0ca11f7ebfc666e66a18b"},
+    {file = "pydantic_core-2.20.1-cp311-none-win32.whl", hash = "sha256:fa2fddcb7107e0d1808086ca306dcade7df60a13a6c347a7acf1ec139aa6789a"},
+    {file = "pydantic_core-2.20.1-cp311-none-win_amd64.whl", hash = "sha256:40a783fb7ee353c50bd3853e626f15677ea527ae556429453685ae32280c19c2"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:595ba5be69b35777474fa07f80fc260ea71255656191adb22a8c53aba4479231"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a4f55095ad087474999ee28d3398bae183a66be4823f753cd7d67dd0153427c9"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f9aa05d09ecf4c75157197f27cdc9cfaeb7c5f15021c6373932bf3e124af029f"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e97fdf088d4b31ff4ba35db26d9cc472ac7ef4a2ff2badeabf8d727b3377fc52"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc633a9fe1eb87e250b5c57d389cf28998e4292336926b0b6cdaee353f89a237"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d573faf8eb7e6b1cbbcb4f5b247c60ca8be39fe2c674495df0eb4318303137fe"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:26dc97754b57d2fd00ac2b24dfa341abffc380b823211994c4efac7f13b9e90e"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:33499e85e739a4b60c9dac710c20a08dc73cb3240c9a0e22325e671b27b70d24"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:bebb4d6715c814597f85297c332297c6ce81e29436125ca59d1159b07f423eb1"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:516d9227919612425c8ef1c9b869bbbee249bc91912c8aaffb66116c0b447ebd"},
+    {file = "pydantic_core-2.20.1-cp312-none-win32.whl", hash = "sha256:469f29f9093c9d834432034d33f5fe45699e664f12a13bf38c04967ce233d688"},
+    {file = "pydantic_core-2.20.1-cp312-none-win_amd64.whl", hash = "sha256:035ede2e16da7281041f0e626459bcae33ed998cca6a0a007a5ebb73414ac72d"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:0827505a5c87e8aa285dc31e9ec7f4a17c81a813d45f70b1d9164e03a813a686"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:19c0fa39fa154e7e0b7f82f88ef85faa2a4c23cc65aae2f5aea625e3c13c735a"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa223cd1e36b642092c326d694d8bf59b71ddddc94cdb752bbbb1c5c91d833b"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c336a6d235522a62fef872c6295a42ecb0c4e1d0f1a3e500fe949415761b8a19"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7eb6a0587eded33aeefea9f916899d42b1799b7b14b8f8ff2753c0ac1741edac"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:70c8daf4faca8da5a6d655f9af86faf6ec2e1768f4b8b9d0226c02f3d6209703"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9fa4c9bf273ca41f940bceb86922a7667cd5bf90e95dbb157cbb8441008482c"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:11b71d67b4725e7e2a9f6e9c0ac1239bbc0c48cce3dc59f98635efc57d6dac83"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:270755f15174fb983890c49881e93f8f1b80f0b5e3a3cc1394a255706cabd203"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c81131869240e3e568916ef4c307f8b99583efaa60a8112ef27a366eefba8ef0"},
+    {file = "pydantic_core-2.20.1-cp313-none-win32.whl", hash = "sha256:b91ced227c41aa29c672814f50dbb05ec93536abf8f43cd14ec9521ea09afe4e"},
+    {file = "pydantic_core-2.20.1-cp313-none-win_amd64.whl", hash = "sha256:65db0f2eefcaad1a3950f498aabb4875c8890438bc80b19362cf633b87a8ab20"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:4745f4ac52cc6686390c40eaa01d48b18997cb130833154801a442323cc78f91"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8ad4c766d3f33ba8fd692f9aa297c9058970530a32c728a2c4bfd2616d3358b"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41e81317dd6a0127cabce83c0c9c3fbecceae981c8391e6f1dec88a77c8a569a"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:04024d270cf63f586ad41fff13fde4311c4fc13ea74676962c876d9577bcc78f"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eaad4ff2de1c3823fddf82f41121bdf453d922e9a238642b1dedb33c4e4f98ad"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:26ab812fa0c845df815e506be30337e2df27e88399b985d0bb4e3ecfe72df31c"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c5ebac750d9d5f2706654c638c041635c385596caf68f81342011ddfa1e5598"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2aafc5a503855ea5885559eae883978c9b6d8c8993d67766ee73d82e841300dd"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4868f6bd7c9d98904b748a2653031fc9c2f85b6237009d475b1008bfaeb0a5aa"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:aa2f457b4af386254372dfa78a2eda2563680d982422641a85f271c859df1987"},
+    {file = "pydantic_core-2.20.1-cp38-none-win32.whl", hash = "sha256:225b67a1f6d602de0ce7f6c1c3ae89a4aa25d3de9be857999e9124f15dab486a"},
+    {file = "pydantic_core-2.20.1-cp38-none-win_amd64.whl", hash = "sha256:6b507132dcfc0dea440cce23ee2182c0ce7aba7054576efc65634f080dbe9434"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b03f7941783b4c4a26051846dea594628b38f6940a2fdc0df00b221aed39314c"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1eedfeb6089ed3fad42e81a67755846ad4dcc14d73698c120a82e4ccf0f1f9f6"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:635fee4e041ab9c479e31edda27fcf966ea9614fff1317e280d99eb3e5ab6fe2"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:77bf3ac639c1ff567ae3b47f8d4cc3dc20f9966a2a6dd2311dcc055d3d04fb8a"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ed1b0132f24beeec5a78b67d9388656d03e6a7c837394f99257e2d55b461611"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6514f963b023aeee506678a1cf821fe31159b925c4b76fe2afa94cc70b3222b"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10d4204d8ca33146e761c79f83cc861df20e7ae9f6487ca290a97702daf56006"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2d036c7187b9422ae5b262badb87a20a49eb6c5238b2004e96d4da1231badef1"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9ebfef07dbe1d93efb94b4700f2d278494e9162565a54f124c404a5656d7ff09"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:6b9d9bb600328a1ce523ab4f454859e9d439150abb0906c5a1983c146580ebab"},
+    {file = "pydantic_core-2.20.1-cp39-none-win32.whl", hash = "sha256:784c1214cb6dd1e3b15dd8b91b9a53852aed16671cc3fbe4786f4f1db07089e2"},
+    {file = "pydantic_core-2.20.1-cp39-none-win_amd64.whl", hash = "sha256:d2fe69c5434391727efa54b47a1e7986bb0186e72a41b203df8f5b0a19a4f669"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a45f84b09ac9c3d35dfcf6a27fd0634d30d183205230a0ebe8373a0e8cfa0906"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d02a72df14dfdbaf228424573a07af10637bd490f0901cee872c4f434a735b94"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2b27e6af28f07e2f195552b37d7d66b150adbaa39a6d327766ffd695799780f"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:084659fac3c83fd674596612aeff6041a18402f1e1bc19ca39e417d554468482"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:242b8feb3c493ab78be289c034a1f659e8826e2233786e36f2893a950a719bb6"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:38cf1c40a921d05c5edc61a785c0ddb4bed67827069f535d794ce6bcded919fc"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e0bbdd76ce9aa5d4209d65f2b27fc6e5ef1312ae6c5333c26db3f5ade53a1e99"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:254ec27fdb5b1ee60684f91683be95e5133c994cc54e86a0b0963afa25c8f8a6"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:407653af5617f0757261ae249d3fba09504d7a71ab36ac057c938572d1bc9331"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:c693e916709c2465b02ca0ad7b387c4f8423d1db7b4649c551f27a529181c5ad"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b5ff4911aea936a47d9376fd3ab17e970cc543d1b68921886e7f64bd28308d1"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:177f55a886d74f1808763976ac4efd29b7ed15c69f4d838bbd74d9d09cf6fa86"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:964faa8a861d2664f0c7ab0c181af0bea66098b1919439815ca8803ef136fc4e"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:4dd484681c15e6b9a977c785a345d3e378d72678fd5f1f3c0509608da24f2ac0"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f6d6cff3538391e8486a431569b77921adfcdef14eb18fbf19b7c0a5294d4e6a"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:a6d511cc297ff0883bc3708b465ff82d7560193169a8b93260f74ecb0a5e08a7"},
+    {file = "pydantic_core-2.20.1.tar.gz", hash = "sha256:26ca695eeee5f9f1aeeb211ffc12f10bcb6f71e2989988fda61dabd65db878d4"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -633,127 +689,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -826,31 +761,16 @@ files = [
 
 [[package]]
 name = "typing-extensions"
-version = "4.11.0"
+version = "4.12.2"
 description = "Backported and Experimental Type Hints for Python 3.8+"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
 ]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "f69ffb8c1545d7beb035533dab109722f844f39f9ffd46b7aceb386e90fa039d"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..e5785f27d8 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -19,13 +19,13 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
 scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
+pydantic = "^2.8.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
@@ -55,6 +55,7 @@ python_version = "3.10"
 enable_error_code = ["ignore-without-code"]
 show_error_codes = true
 warn_unused_ignores = true
+plugins = "pydantic.mypy"
 
 [tool.isort]
 profile = "black"
-- 
2.34.1


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:13   ` Juraj Linkeš
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
  4 siblings, 1 reply; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

This change brings in Pydantic in place of Warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used for validation but kept as an
  alternative format for the developer
- the config schema can now be generated automatically from the
  Pydantic models
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to inheriting the native str and Enum. This change was
  necessary to enable the discriminator for object unions
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument handling is updated to catch the
  ValidationError that TestSuiteConfig can now raise
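To illustrate the str/Enum change above, here is a minimal standalone
sketch of a Pydantic discriminated union over dataclasses, in the style
of TrafficGeneratorConfigTypes. The names (GenType, ScapyCfg, TrexCfg)
are illustrative only, not the actual DTS classes; a second generator
type is invented purely so a union exists to discriminate:

```python
# Sketch: a tagged union where Pydantic selects the model to validate
# against by inspecting the "type" field of the input mapping.
from enum import Enum
from typing import Annotated, Literal, Union

from pydantic import Field, TypeAdapter
from pydantic.dataclasses import dataclass


class GenType(str, Enum):
    # Mixing in plain `str` (rather than StrEnum) so members can serve
    # as Literal discriminator tags in the union below.
    SCAPY = "SCAPY"
    TREX = "TREX"  # hypothetical second type, only to form a union


@dataclass(frozen=True)
class ScapyCfg:
    type: Literal[GenType.SCAPY]


@dataclass(frozen=True)
class TrexCfg:
    type: Literal[GenType.TREX]


# Pydantic dispatches on "type" without trying every union member.
Cfg = Annotated[Union[ScapyCfg, TrexCfg], Field(discriminator="type")]

cfg = TypeAdapter(Cfg).validate_python({"type": "SCAPY"})
assert isinstance(cfg, ScapyCfg)
```

The discriminator gives precise error messages (Pydantic reports which
variant failed and why) instead of the union-wide errors a plain Union
would produce.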

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/config/__init__.py              | 588 +++++++++---------
 dts/framework/config/types.py                 | 132 ----
 dts/framework/runner.py                       |  35 +-
 dts/framework/settings.py                     |  16 +-
 dts/framework/testbed_model/sut_node.py       |   2 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 7 files changed, 325 insertions(+), 454 deletions(-)
 delete mode 100644 dts/framework/config/types.py

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index df60a5030e..013c529829 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,19 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+dataclass model. The Pydantic model is also available as
+:download:`JSON schema <conf_yaml_schema.json>`.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,7 +26,7 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
+The classes defined in this package make heavy use of :mod:`pydantic.dataclasses`.
 All of them use slots and are frozen:
 
     * Slots enables some optimizations, by pre-allocating space for the defined
@@ -33,29 +35,31 @@
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
-from dataclasses import dataclass, fields
-from enum import auto, unique
+from enum import Enum, auto, unique
+from functools import cached_property
 from pathlib import Path
-from typing import Union
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple, Protocol
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import (
+    ConfigDict,
+    Field,
+    StringConstraints,
+    TypeAdapter,
+    ValidationError,
+    field_validator,
+    model_validator,
+)
+from pydantic.config import JsonDict
+from pydantic.dataclasses import dataclass
 from typing_extensions import Self
 
-from framework.config.types import (
-    BuildTargetConfigDict,
-    ConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
 from framework.utils import StrEnum
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
+
 
 @unique
 class Architecture(StrEnum):
@@ -116,14 +120,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class HugepageConfiguration:
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
@@ -136,12 +140,17 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
+PciAddress = Annotated[
+    str, StringConstraints(pattern=r"^[\da-fA-F]{4}:[\da-fA-F]{2}:[\da-fA-F]{2}.\d:?\w*$")
+]
+"""A constrained string type representing a PCI address."""
+
+
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class PortConfig:
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -150,69 +159,53 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
+    pci: PciAddress = Field(description="The local PCI address of the port.")
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: PciAddress = Field(
+        description="The PCI address of the peer port this port is connected to."
+    )
 
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
 
-    The class will be expanded when more configuration is needed.
+class TrafficGeneratorConfig(Protocol):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, the child class is required to define to be distinguished
+            among others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+LogicalCores = Annotated[
+    str,
+    StringConstraints(pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"),
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class NodeConfiguration:
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
@@ -232,69 +225,25 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class SutNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
@@ -302,10 +251,12 @@ class SutNodeConfiguration(NodeConfiguration):
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class TGNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
@@ -313,10 +264,14 @@ class TGNodeConfiguration(NodeConfiguration):
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
 
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
+"""Union type for all the node configuration types."""
 
-@dataclass(slots=True, frozen=True)
+
+@dataclass(slots=True, frozen=True, config=ConfigDict(extra="forbid"))
 class NodeInfo:
     """Supplemental node information.
 
@@ -334,7 +289,7 @@ class NodeInfo:
     kernel_version: str
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class BuildTargetConfiguration:
     """DPDK build configuration.
 
@@ -347,40 +302,21 @@ class BuildTargetConfiguration:
         compiler: The compiler executable to use.
         compiler_wrapper: This string will be put in front of the compiler when
             executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
     """
 
     arch: Architecture
     os: OS
     cpu: CPUType
     compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    compiler_wrapper: str = ""
 
-    @classmethod
-    def from_dict(cls, d: BuildTargetConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
-
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            The build target configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class BuildTargetInfo:
     """Various versions and other information about a build target.
 
@@ -393,11 +329,39 @@ class BuildTargetInfo:
     compiler_version: str
 
 
-@dataclass(slots=True, frozen=True)
+def make_parsable_schema(schema: JsonDict) -> None:
+    """Updates a model's JSON schema to make a string representation a valid alternative.
+
+    This utility function is required to be used with models that can be represented and validated
+    as a string instead of an object mapping. Normally the generated JSON schema will just show
+    the object mapping. This function wraps the mapping under an anyOf property sequenced with a
+    string type.
+
+    This function is a valid `Callable` for the `json_schema_extra` attribute of
+    `~pydantic.config.ConfigDict`.
+    """
+    inner_schema = schema.copy()
+    del inner_schema["title"]
+
+    title = schema.get("title")
+    description = schema.get("description")
+
+    schema.clear()
+
+    schema["title"] = title
+    schema["description"] = description
+    schema["anyOf"] = [inner_schema, {"type": "string"}]
+
+
+@dataclass(
+    frozen=True,
+    config=ConfigDict(extra="forbid", json_schema_extra=make_parsable_schema),
+)
 class TestSuiteConfig:
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. In the configuration file it can also be
+    represented and validated as a string in the form ``TEST_SUITE [TEST_CASE, ...]``.
 
     Attributes:
         test_suite: The name of the test suite module without the starting ``TestSuite_``.
@@ -405,31 +369,63 @@ class TestSuiteConfig:
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying name of the test suite.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert test_suite_spec is not None, f"{self.test_suite_name} is not a valid test suite name"
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names."""
+        available_test_cases = [t.name for t in self.test_suite_spec.test_cases]
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"for test suite {self.test_suite_name}"
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
+
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
+class TestRunSUTNodeConfiguration:
+    """The SUT node configuration of a test run.
+
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class TestRunConfiguration:
     """The configuration of a test run.
 
@@ -442,143 +438,132 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
     """
 
     build_targets: list[BuildTargetConfiguration]
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
 
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The build target and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        build_targets: list[BuildTargetConfiguration] = list(
-            map(BuildTargetConfiguration.from_dict, d["build_targets"])
-        )
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        return cls(
-            build_targets=build_targets,
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
-
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
-
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
-
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
 
-        return type(self)(**new_config)
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
-@dataclass(slots=True, frozen=True)
+
+@dataclass(frozen=True, kw_only=True)
 class Configuration:
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
+    @field_validator("nodes")
     @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
 
-        Build target and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        return self
 
-        Args:
-            d: The configuration dictionary.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
+
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        return cls(test_runs=test_runs)
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
+
+        return test_runs_with_nodes
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` being evaluated for
+        the first time in this call, which triggers its assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
+
+
+ConfigurationType = TypeAdapter(Configuration)
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -588,14 +573,15 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        ConfigurationType.json_schema()
+        return ConfigurationType.validate_python(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
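The string form handled by `TestSuiteConfig.convert_from_string` above boils down to a simple split. A stdlib-only sketch of that step (not the Pydantic-wired code itself; the suite/case names are illustrative):

```python
def convert_from_string(data):
    """Mirror of TestSuiteConfig.convert_from_string: turn the string form
    "TEST_SUITE [TEST_CASE, ...]" into the mapping Pydantic validates."""
    if isinstance(data, str):
        test_suite, *test_cases = data.split()
        return dict(test_suite=test_suite, test_cases=test_cases)
    return data


assert convert_from_string("hello_world hello_world_single_core") == {
    "test_suite": "hello_world",
    "test_cases": ["hello_world_single_core"],
}
# Already-parsed mappings pass through untouched.
assert convert_from_string({"test_suite": "smoke_tests"}) == {"test_suite": "smoke_tests"}
```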
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index cf16556403..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,132 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class BuildTargetConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    build_targets: list[BuildTargetConfigDict]
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 6b6f6a05f5..14e405aced 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -32,8 +32,10 @@
 from .config import (
     BuildTargetConfiguration,
     Configuration,
+    SutNodeConfiguration,
     TestRunConfiguration,
     TestSuiteConfig,
+    TGNodeConfiguration,
     load_config,
 )
 from .exception import (
@@ -142,18 +144,17 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
                 test_run_test_suites = list(
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig("smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -169,6 +170,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -231,10 +234,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases = []
             func_test_cases, perf_test_cases = self._filter_test_cases(
-                test_suite_class, test_suite_config.test_cases
+                test_suite_class, test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -364,6 +367,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -378,24 +383,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -425,7 +432,7 @@ def _run_test_run(
             test_suites_with_cases: The test suites with test cases to run.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index f6303066d4..2e8dedef4f 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -85,6 +85,8 @@
 from pathlib import Path
 from typing import Callable
 
+from pydantic import ValidationError
+
 from .config import TestSuiteConfig
 from .exception import ConfigurationError
 from .utils import DPDKGitTarball, get_commit_id
@@ -391,11 +393,21 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2855fe0276..5957e8140c 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -181,7 +181,7 @@ def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
                 the setup steps will be taken.
         """
         super().set_up_test_run(test_run_config)
-        for vdev in test_run_config.vdevs:
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
 
     def tear_down_test_run(self) -> None:
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 6dac86a224..bb271836a9 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 4ce1148706..39a4170979 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -39,7 +39,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
 
     def send_packet(self, packet: Packet, port: Port) -> None:
         """Send `packet` and block until it is fully sent.
-- 
2.34.1

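The port cross-linking check added in `Configuration.validate_ports` can be sketched with plain dicts — a stdlib-only illustration that assumes field names matching `PortConfig`, not the Pydantic-validated code itself:

```python
def validate_ports(nodes):
    """Sketch of Configuration.validate_ports: every (peer_node, peer_pci)
    must name an existing port, and no two ports may claim the same peer."""
    # False marks "not linked yet"; a tuple records which port linked to it.
    port_links = {
        (node["name"], port["pci"]): False
        for node in nodes
        for port in node["ports"]
    }
    for node_no, node in enumerate(nodes):
        for port_no, port in enumerate(node["ports"]):
            peer = (port["peer_node"], port["peer_pci"])
            linked = port_links.get(peer, None)
            assert linked is not None, (
                f"invalid peer port specified for nodes.{node_no}.ports.{port_no}"
            )
            assert linked is False, (
                f"the peer port specified for nodes.{node_no}.ports.{port_no} "
                f"is already linked to nodes.{linked[0]}.ports.{linked[1]}"
            )
            port_links[peer] = (node_no, port_no)


nodes = [
    {"name": "sut", "ports": [
        {"pci": "0000:00:08.0", "peer_node": "tg", "peer_pci": "0000:00:08.0"}
    ]},
    {"name": "tg", "ports": [
        {"pci": "0000:00:08.0", "peer_node": "sut", "peer_pci": "0000:00:08.0"}
    ]},
]
validate_ports(nodes)  # two mutually-linked ports validate cleanly
```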

^ permalink raw reply	[flat|nested] 13+ messages in thread
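The `make_parsable_schema` helper introduced in this series can be exercised on a plain dict; this stdlib-only sketch mirrors the patch's implementation and shows the resulting anyOf wrapping:

```python
def make_parsable_schema(schema):
    """Mirror of the patch's make_parsable_schema: wrap the generated
    object schema in an anyOf so a plain string is also accepted."""
    inner_schema = schema.copy()
    del inner_schema["title"]

    title = schema.get("title")
    description = schema.get("description")

    schema.clear()
    schema["title"] = title
    schema["description"] = description
    schema["anyOf"] = [inner_schema, {"type": "string"}]


schema = {
    "title": "TestSuiteConfig",
    "description": "Test suite configuration.",
    "type": "object",
    "properties": {"test_suite": {"type": "string"}},
}
make_parsable_schema(schema)
# Both the object mapping and the bare string form are now valid:
assert schema["anyOf"][1] == {"type": "string"}
assert "title" not in schema["anyOf"][0]
```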

* [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (2 preceding siblings ...)
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:39   ` Juraj Linkeš
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
  4 siblings, 1 reply; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are also imported automatically. This causes double imports, as
the runner imports the test suites a second time when loading them.
Change the runner to reuse the classes already imported through
TestSuiteSpec instead of importing them again.
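The double-import problem can be reproduced in isolation: loading the same module twice under different names yields two distinct class objects, so identity checks and registrations break. A minimal stdlib sketch (illustrative names, not DTS code):

```python
import importlib.util
import pathlib
import tempfile

# A throwaway module on disk stands in for a test suite module.
with tempfile.TemporaryDirectory() as tmp:
    mod_path = pathlib.Path(tmp) / "TestSuite_hello.py"
    mod_path.write_text("class TestHello:\n    pass\n")

    def load(name):
        # Each call executes the module again, producing fresh objects.
        spec = importlib.util.spec_from_file_location(name, mod_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module

    first = load("suite_first")    # e.g. TestSuiteSpec discovery
    second = load("suite_second")  # e.g. the runner importing it again

# Two imports yield two distinct class objects with the same name,
# which is why the runner should reuse the already-imported class.
assert first.TestHello is not second.TestHello
assert first.TestHello.__name__ == second.TestHello.__name__
```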

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py | 167 +++++++---------------------------------
 1 file changed, 27 insertions(+), 140 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 14e405aced..00b63cc292 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,14 +18,11 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
-import re
 import sys
 from pathlib import Path
 from types import FunctionType
-from typing import Iterable, Sequence
+from typing import Iterable
 
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
@@ -38,12 +36,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -55,7 +48,7 @@
     TestSuiteResult,
     TestSuiteWithCases,
 )
-from .test_suite import TestSuite
+from .test_suite import TestCase, TestCaseVariant, TestSuite
 
 
 class DTSRunner:
@@ -217,11 +210,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -229,139 +221,34 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
-            test_cases = []
-            func_test_cases, perf_test_cases = self._filter_test_cases(
-                test_suite_class, test_suite_config.test_cases_names
-            )
-            if func:
-                test_cases.extend(func_test_cases)
-            if perf:
-                test_cases.extend(perf_test_cases)
-
-            test_suites_with_cases.append(
-                TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
-            )
-
-        return test_suites_with_cases
-
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
-    def _filter_test_cases(
-        self, test_suite_class: type[TestSuite], test_cases_to_run: Sequence[str]
-    ) -> tuple[list[FunctionType], list[FunctionType]]:
-        """Filter `test_cases_to_run` from `test_suite_class`.
-
-        There are two rounds of filtering if `test_cases_to_run` is not empty.
-        The first filters `test_cases_to_run` from all methods of `test_suite_class`.
-        Then the methods are separated into functional and performance test cases.
-        If a method matches neither the functional nor performance name prefix, it's an error.
-
-        Args:
-            test_suite_class: The class of the test suite.
-            test_cases_to_run: Test case names to filter from `test_suite_class`.
-                If empty, return all matching test cases.
-
-        Returns:
-            A list of test case methods that should be executed.
+            test_suite_spec = test_suite_config.test_suite_spec
+            test_suite_class = test_suite_spec.class_type
+
+            filtered_test_cases: list[TestCase] = [
+                test_case
+                for test_case in test_suite_spec.test_cases
+                if not test_suite_config.test_cases_names
+                or test_case.name in test_suite_config.test_cases_names
+            ]
 
-        Raises:
-            ConfigurationError: If a test case from `test_cases_to_run` is not found
-                or it doesn't match either the functional nor performance name prefix.
-        """
-        func_test_cases = []
-        perf_test_cases = []
-        name_method_tuples = inspect.getmembers(test_suite_class, inspect.isfunction)
-        if test_cases_to_run:
-            name_method_tuples = [
-                (name, method) for name, method in name_method_tuples if name in test_cases_to_run
+            selected_test_cases: list[FunctionType] = [
+                test_case.function_type  # type: ignore[misc]
+                for test_case in filtered_test_cases
+                if (func and test_case.variant == TestCaseVariant.FUNCTIONAL)
+                or (perf and test_case.variant == TestCaseVariant.PERFORMANCE)
             ]
-            if len(name_method_tuples) < len(test_cases_to_run):
-                missing_test_cases = set(test_cases_to_run) - {
-                    name for name, _ in name_method_tuples
-                }
-                raise ConfigurationError(
-                    f"Test cases {missing_test_cases} not found among methods "
-                    f"of {test_suite_class.__name__}."
-                )
 
-        for test_case_name, test_case_method in name_method_tuples:
-            if re.match(self._func_test_case_regex, test_case_name):
-                func_test_cases.append(test_case_method)
-            elif re.match(self._perf_test_case_regex, test_case_name):
-                perf_test_cases.append(test_case_method)
-            elif test_cases_to_run:
-                raise ConfigurationError(
-                    f"Method '{test_case_name}' matches neither "
-                    f"a functional nor a performance test case name."
+            test_suites_with_cases.append(
+                TestSuiteWithCases(
+                    test_suite_class=test_suite_class, test_cases=selected_test_cases
                 )
-
-        return func_test_cases, perf_test_cases
+            )
+        return test_suites_with_cases
 
     def _connect_nodes_and_run_test_run(
         self,
-- 
2.34.1
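The two-step case selection added in this patch (filter by requested names,
then by enabled variants) can be sketched in isolation as follows. `Case`,
`TestCaseVariant` and `select_cases` here are simplified stand-ins for the
framework's types, not the real DTS code:

```python
from dataclasses import dataclass
from enum import Enum, auto

class TestCaseVariant(Enum):
    FUNCTIONAL = auto()
    PERFORMANCE = auto()

@dataclass
class Case:
    name: str
    variant: TestCaseVariant

def select_cases(cases, names, func, perf):
    """Mirror of the runner's filter: an empty name list selects all
    cases, then only the enabled variants are kept."""
    filtered = [c for c in cases if not names or c.name in names]
    return [
        c for c in filtered
        if (func and c.variant is TestCaseVariant.FUNCTIONAL)
        or (perf and c.variant is TestCaseVariant.PERFORMANCE)
    ]

cases = [
    Case("test_basic", TestCaseVariant.FUNCTIONAL),
    Case("test_throughput", TestCaseVariant.PERFORMANCE),
]
assert [c.name for c in select_cases(cases, [], func=True, perf=False)] == ["test_basic"]
```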


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [PATCH 5/5] dts: add JSON schema generation script
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (3 preceding siblings ...)
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:59   ` Juraj Linkeš
  4 siblings, 1 reply; 13+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Add a new script which automatically regenerates the JSON schema file
from the Pydantic configuration models.

Moreover, use this script to regenerate the JSON schema for the first
time.
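
As a rough sketch of the mechanism (assuming Pydantic v2, and using a
hypothetical minimal model rather than the real DTS configuration models,
which the actual script walks in full):

```python
import json
from pydantic import BaseModel

# Hypothetical stand-in for one of the DTS configuration models.
class HugepageConfiguration(BaseModel):
    number_of: int
    force_first_numa: bool

# model_json_schema() produces a JSON Schema dict, with nested models
# placed under "$defs"; dumping it with json.dumps yields the file content.
schema = HugepageConfiguration.model_json_schema()
print(json.dumps(schema, indent=2))
```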

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/guides/tools/dts.rst                   |  10 +
 dts/framework/config/conf_yaml_schema.json | 776 ++++++++++++---------
 dts/generate-schema.py                     |  38 +
 3 files changed, 486 insertions(+), 338 deletions(-)
 create mode 100755 dts/generate-schema.py

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..317bd0ff99 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -430,6 +430,16 @@ Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
 Configuration Schema
 --------------------
 
+The configuration schema is automatically generated from Pydantic models and can be found
+at ``dts/framework/config/conf_yaml_schema.json``. Whenever the models are changed, the schema
+should be regenerated using the dedicated script at ``dts/generate-schema.py``, e.g.:
+
+.. code-block:: console
+
+   $ poetry shell
+   (dts-py3.10) $ ./generate-schema.py
+
+
 Definitions
 ~~~~~~~~~~~
 
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index f02a310bb5..1cf1bb098a 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -1,402 +1,502 @@
 {
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
+  "$defs": {
+    "Architecture": {
+      "description": "The supported architectures of :class:`~framework.testbed_model.node.Node`\\s.",
       "enum": [
+        "i686",
         "x86_64",
+        "x86_32",
         "arm64",
         "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
+      ],
+      "title": "Architecture",
+      "type": "string"
     },
-    "build_target": {
-      "type": "object",
-      "description": "Targets supported by DTS",
+    "BuildTargetConfiguration": {
+      "additionalProperties": false,
+      "description": "DPDK build configuration.\n\nThe configuration used for building DPDK.\n\nAttributes:\n    arch: The target architecture to build for.\n    os: The target os to build for.\n    cpu: The target CPU to build for.\n    compiler: The compiler executable to use.\n    compiler_wrapper: This string will be put in front of the compiler when\n        executing the build. Useful for adding wrapper commands, such as ``ccache``.",
       "properties": {
         "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
+          "$ref": "#/$defs/Architecture"
         },
         "os": {
-          "$ref": "#/definitions/OS"
+          "$ref": "#/$defs/OS"
         },
         "cpu": {
-          "$ref": "#/definitions/cpu"
+          "$ref": "#/$defs/CPUType"
         },
         "compiler": {
-          "$ref": "#/definitions/compiler"
+          "$ref": "#/$defs/Compiler"
         },
-          "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
+        "compiler_wrapper": {
+          "default": "",
+          "title": "Compiler Wrapper",
+          "type": "string"
         }
       },
-      "additionalProperties": false,
       "required": [
         "arch",
         "os",
         "cpu",
         "compiler"
-      ]
+      ],
+      "title": "BuildTargetConfiguration",
+      "type": "object"
     },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
+    "CPUType": {
+      "description": "The supported CPUs of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "native",
+        "armv8a",
+        "dpaa2",
+        "thunderx",
+        "xgene1"
+      ],
+      "title": "CPUType",
+      "type": "string"
+    },
+    "Compiler": {
+      "description": "The supported compilers of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "gcc",
+        "clang",
+        "icc",
+        "msvc"
+      ],
+      "title": "Compiler",
+      "type": "string"
+    },
+    "HugepageConfiguration": {
+      "additionalProperties": false,
+      "description": "The hugepage configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    number_of: The number of hugepages to allocate.\n    force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.",
       "properties": {
         "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
+          "title": "Number Of",
+          "type": "integer"
         },
         "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
+          "title": "Force First Numa",
+          "type": "boolean"
         }
       },
-      "additionalProperties": false,
       "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
+        "number_of",
+        "force_first_numa"
+      ],
+      "title": "HugepageConfiguration",
+      "type": "object"
     },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
+    "OS": {
+      "description": "The supported operating systems of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "linux",
+        "freebsd",
+        "windows"
+      ],
+      "title": "OS",
+      "type": "string"
     },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
+    "PortConfig": {
+      "additionalProperties": false,
+      "description": "The port configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    pci: The PCI address of the port.\n    os_driver_for_dpdk: The operating system driver name for use with DPDK.\n    os_driver: The operating system driver name when the operating system controls the port.\n    peer_node: The :class:`~framework.testbed_model.node.Node` of the port\n        connected to this port.\n    peer_pci: The PCI address of the port connected to this port.",
+      "properties": {
+        "pci": {
+          "description": "The local PCI address of the port.",
+          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
+          "title": "Pci",
+          "type": "string"
+        },
+        "os_driver_for_dpdk": {
+          "description": "The driver that the kernel should bind this device to for DPDK to use it.",
+          "examples": [
+            "vfio-pci",
+            "mlx5_core"
+          ],
+          "title": "Os Driver For Dpdk",
+          "type": "string"
+        },
+        "os_driver": {
+          "description": "The driver normally used by this port",
+          "examples": [
+            "i40e",
+            "ice",
+            "mlx5_core"
+          ],
+          "title": "Os Driver",
+          "type": "string"
+        },
+        "peer_node": {
+          "description": "The name of the peer node this port is connected to.",
+          "title": "Peer Node",
+          "type": "string"
+        },
+        "peer_pci": {
+          "description": "The PCI address of the peer port this port is connected to.",
+          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
+          "title": "Peer Pci",
+          "type": "string"
         }
-      ]
+      },
+      "required": [
+        "pci",
+        "os_driver_for_dpdk",
+        "os_driver",
+        "peer_node",
+        "peer_pci"
+      ],
+      "title": "PortConfig",
+      "type": "object"
     },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter"
-      ]
+    "ScapyTrafficGeneratorConfig": {
+      "additionalProperties": false,
+      "description": "Scapy traffic generator specific configuration.",
+      "properties": {
+        "type": {
+          "const": "SCAPY",
+          "enum": [
+            "SCAPY"
+          ],
+          "title": "Type",
+          "type": "string"
+        }
+      },
+      "required": [
+        "type"
+      ],
+      "title": "ScapyTrafficGeneratorConfig",
+      "type": "object"
     },
-    "test_target": {
-      "type": "object",
+    "SutNodeConfiguration": {
+      "additionalProperties": false,
+      "description": ":class:`~framework.testbed_model.sut_node.SutNode` specific configuration.\n\nAttributes:\n    memory_channels: The number of memory channels to use when running DPDK.",
       "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
+        "name": {
+          "description": "A unique identifier for this node.",
+          "title": "Name",
+          "type": "string"
+        },
+        "hostname": {
+          "description": "The hostname or IP address of the node.",
+          "title": "Hostname",
+          "type": "string"
+        },
+        "user": {
+          "description": "The login user to use to connect to this node.",
+          "title": "User",
+          "type": "string"
         },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
+        "password": {
+          "anyOf": [
+            {
+              "type": "string"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null,
+          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
+          "title": "Password"
+        },
+        "use_first_core": {
+          "default": false,
+          "description": "DPDK won't use the first physical core if set to False.",
+          "title": "Use First Core",
+          "type": "boolean"
+        },
+        "hugepages_2mb": {
+          "anyOf": [
+            {
+              "$ref": "#/$defs/HugepageConfiguration"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null
+        },
+        "ports": {
           "items": {
-            "type": "string"
+            "$ref": "#/$defs/PortConfig"
           },
-          "minimum": 1
+          "minItems": 1,
+          "title": "Ports",
+          "type": "array"
+        },
+        "memory_channels": {
+          "default": 1,
+          "description": "Number of memory channels to use when running DPDK.",
+          "title": "Memory Channels",
+          "type": "integer"
+        },
+        "arch": {
+          "$ref": "#/$defs/Architecture"
+        },
+        "os": {
+          "$ref": "#/$defs/OS"
+        },
+        "lcores": {
+          "default": "1",
+          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
+          "examples": [
+            "1,2,3,4,5,18-22",
+            "10-15"
+          ],
+          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+          "title": "Lcores",
+          "type": "string"
         }
       },
       "required": [
-        "suite"
+        "name",
+        "hostname",
+        "user",
+        "ports",
+        "arch",
+        "os"
       ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
+      "title": "SutNodeConfiguration",
+      "type": "object"
+    },
+    "TGNodeConfiguration": {
+      "additionalProperties": false,
+      "description": ":class:`~framework.testbed_model.tg_node.TGNode` specific configuration.\n\nAttributes:\n    traffic_generator: The configuration of the traffic generator present on the TG node.",
+      "properties": {
+        "name": {
+          "description": "A unique identifier for this node.",
+          "title": "Name",
+          "type": "string"
+        },
+        "hostname": {
+          "description": "The hostname or IP address of the node.",
+          "title": "Hostname",
+          "type": "string"
+        },
+        "user": {
+          "description": "The login user to use to connect to this node.",
+          "title": "User",
+          "type": "string"
+        },
+        "password": {
+          "anyOf": [
+            {
+              "type": "string"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null,
+          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
+          "title": "Password"
+        },
+        "use_first_core": {
+          "default": false,
+          "description": "DPDK won't use the first physical core if set to False.",
+          "title": "Use First Core",
+          "type": "boolean"
+        },
+        "hugepages_2mb": {
+          "anyOf": [
+            {
+              "$ref": "#/$defs/HugepageConfiguration"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null
+        },
+        "ports": {
+          "items": {
+            "$ref": "#/$defs/PortConfig"
           },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
+          "minItems": 1,
+          "title": "Ports",
+          "type": "array"
+        },
+        "arch": {
+          "$ref": "#/$defs/Architecture"
+        },
+        "os": {
+          "$ref": "#/$defs/OS"
+        },
+        "lcores": {
+          "default": "1",
+          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
+          "examples": [
+            "1,2,3,4,5,18-22",
+            "10-15"
+          ],
+          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+          "title": "Lcores",
+          "type": "string"
+        },
+        "traffic_generator": {
+          "discriminator": {
+            "mapping": {
+              "SCAPY": "#/$defs/ScapyTrafficGeneratorConfig"
+            },
+            "propertyName": "type"
           },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
+          "oneOf": [
+            {
+              "$ref": "#/$defs/ScapyTrafficGeneratorConfig"
+            }
+          ],
+          "title": "Traffic Generator"
+        }
+      },
+      "required": [
+        "name",
+        "hostname",
+        "user",
+        "ports",
+        "arch",
+        "os",
+        "traffic_generator"
+      ],
+      "title": "TGNodeConfiguration",
+      "type": "object"
+    },
+    "TestRunConfiguration": {
+      "additionalProperties": false,
+      "description": "The configuration of a test run.\n\nThe configuration contains testbed information, what tests to execute\nand with what DPDK build.\n\nAttributes:\n    build_targets: A list of DPDK builds to test.\n    perf: Whether to run performance tests.\n    func: Whether to run functional tests.\n    skip_smoke_tests: Whether to skip smoke tests.\n    test_suites: The names of test suites and/or test cases to execute.\n    system_under_test_node: The SUT node configuration to use in this test run.\n    traffic_generator_node: The TG node name to use in this test run.",
+      "properties": {
+        "perf": {
+          "description": "Enable performance testing.",
+          "title": "Perf",
+          "type": "boolean"
+        },
+        "func": {
+          "description": "Enable functional testing.",
+          "title": "Func",
+          "type": "boolean"
+        },
+        "test_suites": {
+          "items": {
+            "$ref": "#/$defs/TestSuiteConfig"
           },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
+          "minItems": 1,
+          "title": "Test Suites",
+          "type": "array"
+        },
+        "build_targets": {
+          "items": {
+            "$ref": "#/$defs/BuildTargetConfiguration"
           },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
+          "title": "Build Targets",
+          "type": "array"
+        },
+        "skip_smoke_tests": {
+          "default": false,
+          "title": "Skip Smoke Tests",
+          "type": "boolean"
+        },
+        "system_under_test_node": {
+          "$ref": "#/$defs/TestRunSUTNodeConfiguration"
+        },
+        "traffic_generator_node": {
+          "title": "Traffic Generator Node",
+          "type": "string"
+        }
+      },
+      "required": [
+        "perf",
+        "func",
+        "test_suites",
+        "build_targets",
+        "system_under_test_node",
+        "traffic_generator_node"
+      ],
+      "title": "TestRunConfiguration",
+      "type": "object"
+    },
+    "TestRunSUTNodeConfiguration": {
+      "additionalProperties": false,
+      "description": "The SUT node configuration of a test run.\n\nAttributes:\n    node_name: The SUT node to use in this test run.\n    vdevs: The names of virtual devices to test.",
+      "properties": {
+        "vdevs": {
+          "items": {
+            "type": "string"
           },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
+          "title": "Vdevs",
+          "type": "array"
         },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
+        "node_name": {
+          "title": "Node Name",
+          "type": "string"
+        }
       },
-      "minimum": 1
+      "required": [
+        "node_name"
+      ],
+      "title": "TestRunSUTNodeConfiguration",
+      "type": "object"
     },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "build_targets": {
-            "type": "array",
-            "items": {
-              "$ref": "#/definitions/build_target"
+    "TestSuiteConfig": {
+      "anyOf": [
+        {
+          "additionalProperties": false,
+          "properties": {
+            "test_suite": {
+              "description": "The identifying name of the test suite.",
+              "title": "Test suite name",
+              "type": "string"
             },
-            "minimum": 1
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
+            "test_cases": {
+              "description": "The identifying name of the test cases of the test suite.",
+              "items": {
+                "type": "string"
+              },
+              "title": "Test cases by name",
+              "type": "array"
             }
           },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
+          "required": [
+            "test_suite"
+          ],
+          "type": "object"
+        },
+        {
+          "type": "string"
+        }
+      ],
+      "description": "Test suite configuration.\n\nInformation about a single test suite to be executed. It can be represented and validated as a\nstring type in the form of: ``TEST_SUITE [TEST_CASE, ...]``, in the configuration file.\n\nAttributes:\n    test_suite: The name of the test suite module without the starting ``TestSuite_``.\n    test_cases: The names of test cases from this test suite to execute.\n        If empty, all test cases will be executed.",
+      "title": "TestSuiteConfig"
+    }
+  },
+  "description": "DTS testbed and test configuration.\n\nAttributes:\n    test_runs: Test run configurations.\n    nodes: Node configurations.",
+  "properties": {
+    "test_runs": {
+      "items": {
+        "$ref": "#/$defs/TestRunConfiguration"
+      },
+      "minItems": 1,
+      "title": "Test Runs",
+      "type": "array"
+    },
+    "nodes": {
+      "items": {
+        "anyOf": [
+          {
+            "$ref": "#/$defs/TGNodeConfiguration"
           },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
+          {
+            "$ref": "#/$defs/SutNodeConfiguration"
           }
-        },
-        "additionalProperties": false,
-        "required": [
-          "build_targets",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
         ]
       },
-      "minimum": 1
+      "minItems": 1,
+      "title": "Nodes",
+      "type": "array"
     }
   },
   "required": [
     "test_runs",
     "nodes"
   ],
-  "additionalProperties": false
-}
+  "title": "Configuration",
+  "type": "object",
+  "$schema": "https://json-schema.org/draft/2020-12/schema"
+}
\ No newline at end of file
diff --git a/dts/generate-schema.py b/dts/generate-schema.py
new file mode 100755
index 0000000000..b41d28492f
--- /dev/null
+++ b/dts/generate-schema.py
@@ -0,0 +1,38 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Arm Limited
+
+"""JSON schema generation script."""
+
+import json
+import os
+
+from pydantic.json_schema import GenerateJsonSchema
+
+from framework.config import ConfigurationType
+
+DTS_DIR = os.path.dirname(os.path.realpath(__file__))
+RELATIVE_PATH_TO_SCHEMA = "framework/config/conf_yaml_schema.json"
+
+
+class GenerateSchemaWithDialect(GenerateJsonSchema):
+    """Custom schema generator which adds the schema dialect."""
+
+    def generate(self, schema, mode="validation"):
+        """Generate JSON schema."""
+        json_schema = super().generate(schema, mode=mode)
+        json_schema["$schema"] = self.schema_dialect
+        return json_schema
+
+
+try:
+    path = os.path.join(DTS_DIR, RELATIVE_PATH_TO_SCHEMA)
+
+    with open(path, "w") as schema_file:
+        schema_dict = ConfigurationType.json_schema(schema_generator=GenerateSchemaWithDialect)
+        schema_json = json.dumps(schema_dict, indent=2)
+        schema_file.write(schema_json)
+
+    print("Schema generated successfully!")
+except Exception as e:
+    raise Exception("failed to generate schema") from e
-- 
2.34.1


^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-09-16 13:00   ` Juraj Linkeš
  2024-09-19 20:01   ` Nicholas Pratte
  1 sibling, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-16 13:00 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek

There are some elements which seem to be present in 
https://patches.dpdk.org/project/dpdk/patch/20240821145315.97974-4-juraj.linkes@pantheon.tech/, 
which is an attempt at decorating test cases (Bugzilla 1460) as part of 
the capabilities series.

Looks like we could create a separate patch with 1460 and this patch in 
it on which both capabilities and this series would depend. What do you 
think? Certainly makes sense to have decorating test cases separate 
from capabilities.

On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> Currently there is a lack of a definition which identifies all the test
> suites available to test. This change intends to simplify the process to
> discover all the test suites and identify them.
> 
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>   dts/framework/test_suite.py | 182 +++++++++++++++++++++++++++++++++++-
>   1 file changed, 181 insertions(+), 1 deletion(-)
> 
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 694b2eba65..972968b036 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -1,6 +1,7 @@
>   # SPDX-License-Identifier: BSD-3-Clause
>   # Copyright(c) 2010-2014 Intel Corporation
>   # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>   
>   """Features common to all test suites.
>   
> @@ -13,12 +14,22 @@
>       * Test case verification.
>   """
>   
> +import inspect
> +import re
> +from dataclasses import dataclass
> +from enum import Enum, auto
> +from functools import cached_property
> +from importlib import import_module
>   from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> -from typing import ClassVar, Union
> +from pkgutil import iter_modules
> +from types import FunctionType, ModuleType
> +from typing import ClassVar, NamedTuple, Union
>   
> +from pydantic.alias_generators import to_pascal

This is using pydantic, but it's only added in the subsequent patch.

>   from scapy.layers.inet import IP  # type: ignore[import-untyped]
>   from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
>   from scapy.packet import Packet, Padding  # type: ignore[import-untyped]
> +from typing_extensions import Self
>   
>   from framework.testbed_model.port import Port, PortLink
>   from framework.testbed_model.sut_node import SutNode
> @@ -365,3 +376,172 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
>           if received_packet.src != expected_packet.src or received_packet.dst != expected_packet.dst:
>               return False
>           return True
> +
> +
> +class TestCaseVariant(Enum):
> +    """Enum representing the variant of the test case."""
> +
> +    #:
> +    FUNCTIONAL = auto()
> +    #:
> +    PERFORMANCE = auto()
> +
> +
> +class TestCase(NamedTuple):
> +    """Tuple representing a test case."""
> +
> +    #: The name of the test case without prefix
> +    name: str
> +    #: The reference to the function
> +    function_type: FunctionType

I had to read almost the whole patch to understand what this is. It's 
not the type of a function, it's the function object, which is what the 
docstring says, but I glossed over that. This should be just function or 
maybe function_obj.

> +    #: The test case variant
> +    variant: TestCaseVariant
> +
> +
> +@dataclass
> +class TestSuiteSpec:
> +    """A class defining the specification of a test suite.
> +
> +    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
> +    provided to automatically discover all the available test suites.
> +

We should probably document the assumption that there's only one 
TestCase class in a test case module.

> +    Attributes:
> +        module_name: The name of the test suite's module.
> +    """
> +
> +    #:
> +    TEST_SUITES_PACKAGE_NAME = "tests"

Formally speaking, the tests dir doesn't have an __init__.py file in it, 
so it isn't a package, but the name is fine.

> +    #:
> +    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
> +    #:
> +    TEST_SUITE_CLASS_PREFIX = "Test"
> +    #:
> +    TEST_CASE_METHOD_PREFIX = "test_"
> +    #:
> +    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
> +    #:
> +    PERF_TEST_CASE_REGEX = r"test_perf_"
> +

These are common to all test suites, so they should be class variables.

I'm also wondering whether these should be documented in the module 
level docstring. It makes sense that we document there what a subclass 
is supposed to look like (and where it's supposed to be located by 
default). If we do this, we may need to move parts of the class's 
docstring as well.

> +    module_name: str
> +
> +    @cached_property

Nice touch, we are using our own implementation of this elsewhere, so 
maybe we should create a ticket to update those to use @cached_property 
instead.
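
For reference, a minimal stdlib-only sketch of what @cached_property buys 
us (the class and attribute names below are made up for illustration, not 
taken from the patch):

```python
from functools import cached_property


class ModuleSpec:
    def __init__(self, module_name: str) -> None:
        self.module_name = module_name
        self.compute_count = 0

    @cached_property
    def name(self) -> str:
        # The body runs only on first access; the result is then
        # stored on the instance and reused.
        self.compute_count += 1
        return self.module_name.removeprefix("TestSuite_")


spec = ModuleSpec("TestSuite_hello_world")
assert spec.name == "hello_world"
assert spec.name == "hello_world"  # second access hits the cache
assert spec.compute_count == 1
```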

> +    def name(self) -> str:

TestSuiteSpec.name really sounds like the name of a TestSuite, so I'd rename 
this to module_name.

> +        """The name of the test suite's module."""
> +        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
> +
> +    @cached_property
> +    def module_type(self) -> ModuleType:

This isn't a module type, just an instance of the module object, right? 
Could be named just module.

> +        """A reference to the test suite's module."""
> +        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
> +
> +    @cached_property
> +    def class_name(self) -> str:
> +        """The name of the test suite's class."""
> +        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
> +
> +    @cached_property
> +    def class_type(self) -> type[TestSuite]:

Class type would be the type of the class, but this is just the class, 
right? Could be named just class.

> +        """A reference to the test suite's class."""
> +
> +        def is_test_suite(obj) -> bool:
> +            """Check whether `obj` is a :class:`TestSuite`.
> +
> +            The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
> +
> +            Args:
> +                obj: The object to be checked.
> +
> +            Returns:
> +                :data:`True` if `obj` is a subclass of `TestSuite`.
> +            """
> +            try:
> +                if issubclass(obj, TestSuite) and obj is not TestSuite:
> +                    return True
> +            except TypeError:
> +                return False
> +            return False
> +
> +        for class_name, class_type in inspect.getmembers(self.module_type, is_test_suite):
> +            if class_name == self.class_name:
> +                return class_type
> +
> +        raise Exception("class not found in eligible test module")

This should be a DTS error, maybe InternalError? This doesn't seem like 
ConfigurationError. It should also say which module and be a proper 
sentence (capital first letter, end with a dot).
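
To make the pattern concrete, here is a self-contained sketch of the 
predicate-based class lookup (class names are hypothetical; in the real 
code the final miss is the part that should raise the DTS error):

```python
import inspect
from types import SimpleNamespace


class TestSuite: ...


class TestHelloWorld(TestSuite): ...


def is_test_suite(obj) -> bool:
    """True for strict subclasses of TestSuite; tolerant of non-class members."""
    try:
        return issubclass(obj, TestSuite) and obj is not TestSuite
    except TypeError:  # issubclass() raises for non-class objects
        return False


# Stand-in for an imported test suite module:
module = SimpleNamespace(TestSuite=TestSuite, TestHelloWorld=TestHelloWorld, helper=len)
found = dict(inspect.getmembers(module, is_test_suite))
assert found == {"TestHelloWorld": TestHelloWorld}
```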

> +
> +    @cached_property
> +    def test_cases(self) -> list[TestCase]:
> +        """A list of all the available test cases."""
> +        test_cases = []
> +
> +        functions = inspect.getmembers(self.class_type, inspect.isfunction)
> +        for fn_name, fn_type in functions:

fn_obj instead of fn_type. The type suffix used in the whole module is 
very confusing.

> +            if prefix := re.match(self.FUNC_TEST_CASE_REGEX, fn_name):
> +                variant = TestCaseVariant.FUNCTIONAL
> +            elif prefix := re.match(self.PERF_TEST_CASE_REGEX, fn_name):
> +                variant = TestCaseVariant.PERFORMANCE
> +            else:
> +                continue
> +
> +            name = fn_name[len(prefix.group(0)) :]

Do we actually want to strip the prefix? It could be confusing if it 
appears in logs.
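
For context, the prefix handling in question boils down to this (the 
regexes are copied from the patch, the surrounding names are illustrative):

```python
import re

# Regexes as defined in the patch:
FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
PERF_TEST_CASE_REGEX = r"test_perf_"


def classify(fn_name: str):
    """Return (variant, stripped_name), or None for non-test methods."""
    if prefix := re.match(FUNC_TEST_CASE_REGEX, fn_name):
        variant = "FUNCTIONAL"
    elif prefix := re.match(PERF_TEST_CASE_REGEX, fn_name):
        variant = "PERFORMANCE"
    else:
        return None
    # The patch strips the matched prefix from the reported name.
    return variant, fn_name[len(prefix.group(0)):]


assert classify("test_basic_link") == ("FUNCTIONAL", "basic_link")
assert classify("test_perf_throughput") == ("PERFORMANCE", "throughput")
assert classify("set_up_suite") is None
```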

> +            test_cases.append(TestCase(name, fn_type, variant))
> +
> +        return test_cases
> +
> +    @classmethod
> +    def discover_all(
> +        cls, package_name: str | None = None, module_prefix: str | None = None
> +    ) -> list[Self]:
> +        """Discover all the test suites.
> +
> +        The test suites are discovered in the provided `package_name`. The full module name,
> +        expected under that package, is prefixed with `module_prefix`.
> +        The module name is a standard filename with words separated with underscores.
> +        For each module found, search for a :class:`TestSuite` class which starts
> +        with `self.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in PascalCase.

`self.TEST_SUITE_CLASS_PREFIX` -> 
:attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`

> +
> +        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
> +
> +            OS -> Os
> +            TCP -> Tcp
> +
> +        Args:
> +            package_name: The name of the package where to find the test suites, if none is set the

I'd separate this into two sentences, with the second one reworded a bit:

If :data:`None`, the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` 
constant is used.

> +                constant :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used instead.
> +            module_prefix: The name prefix defining the test suite module, if none is set the

Same here.

> +                constant :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` is used instead.
> +
> +        Returns:
> +            A list containing all the discovered test suites.
> +        """
> +        if package_name is None:
> +            package_name = cls.TEST_SUITES_PACKAGE_NAME
> +        if module_prefix is None:
> +            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
> +
> +        test_suites = []
> +
> +        test_suites_pkg = import_module(package_name)
> +        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
> +            if not module_name.startswith(module_prefix) or is_pkg:
> +                continue
> +
> +            test_suite = cls(module_name)
> +            try:
> +                if test_suite.class_type:
> +                    test_suites.append(test_suite)
> +            except Exception:
> +                pass

It may be beneficial to log a warning that we found a {module_prefix} 
test suite module without any actual valid test suites.

> +
> +        return test_suites
> +
> +
> +AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
> +"""Constant to store all the available, discovered and imported test suites.
> +
> +The test suites should be gathered from this list to avoid importing more than once.
> +"""

We could store this in TestSuiteSpec itself. This would allow us to move 
the find_by_name function into it and also not import everything at 
once, but only what's needed if it hadn't been imported before, but 
maybe we don't want to do that since we lose the verification aspect.

I'm just not a fan of code being executed when we import a module, since 
we didn't call anything, it just sorta happened. Looks like this is used 
when parsing configuration, so we could do the full scan using 
@cached_property and that way it'll be the best of both worlds.

> +
> +
> +def find_by_name(name: str) -> TestSuiteSpec | None:

It should be clearer from the name/args/docstring that we're trying to 
find the test suite by module name.

> +    """Find a requested test suite by name from the available ones."""
> +    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)

A list comprehension would be easier to understand I think (mostly 
because it would remove the question of why do it this way instead of 
list comprehension):
test_suite_specs = [test_suite_spec for test_suite_spec in 
AVAILABLE_TEST_SUITES if test_suite_spec.name == name]

> +    return next(test_suites, None)

And then test_suite_specs[0] if test_suite_specs else None

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
@ 2024-09-16 13:17   ` Juraj Linkeš
  2024-09-19 19:56   ` Nicholas Pratte
  1 sibling, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-16 13:17 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek



On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> Add Pydantic to the project dependencies while dropping Warlock.
> 

We should explain what pydantic is and why it's replacing warlock (and I 
think make them lowercase as that's how they appear in pyproject.toml).

But maybe we shouldn't remove warlock in this patch since that would 
make the dependencies incorrect (the code still uses it). The code would 
still work with previously installed dependencies so it's probably not a 
big deal though.

> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
@ 2024-09-17 11:13   ` Juraj Linkeš
  0 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:13 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek



On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> This change brings in Pydantic in place of Warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> a more resilient and simpler code. As a consequence of this change:
> 
> - most validation is now built-in
> - further validation is added to verify:
>    - cross referencing of node names and ports
>    - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used for validation but kept as an
>    alternative format for the developer

If it's not used, we should remove it right away (in this patch). I see 
that it's updated in v5, but we can just add it back.

> - the config schema can now be generated automatically from the
>    Pydantic models
> - the TrafficGeneratorType enum has been changed from inheriting
>    StrEnum to the native str and Enum. This change was necessary to
>    enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>    match the structure of the configuration files
> - updates the test suite argument to catch the ValidationError that
>    TestSuiteConfig can now raise

Passive voice is not used here, but the rest of the bullet points are 
using it.

>   delete mode 100644 dts/framework/config/types.py

A note, don't forget to remove this from doc sources if those get merged 
before this.

> 
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py

> @@ -2,17 +2,19 @@

> -the YAML test run configuration file
> -and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> +dataclass model. The Pydantic model is also available as
> +:download:`JSON schema <conf_yaml_schema.json>`.

This second sentence should be moved to the last patch.


> @@ -33,29 +35,31 @@

> +)
> +from pydantic.config import JsonDict
> +from pydantic.dataclasses import dataclass

We should probably distinguish between built-in dataclasses and pydantic 
dataclasses (as pydantic adds the extra argument). Importing them as 
pydantic_dataclass seems like the easiest way to achieve this.


> @@ -116,14 +120,14 @@ class Compiler(StrEnum):

>   @unique
> -class TrafficGeneratorType(StrEnum):
> +class TrafficGeneratorType(str, Enum):
>       """The supported traffic generators."""
>   
>       #:
> -    SCAPY = auto()
> +    SCAPY = "SCAPY"

Do discriminators not work with auto()?


> -@dataclass(slots=True, frozen=True)
> +@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))

Is there any special reason for kw_only? Maybe we should add the reason 
for this (and also the config arg) to the module docstring and commit msg.


> @@ -136,12 +140,17 @@ class HugepageConfiguration:

> -@dataclass(slots=True, frozen=True)
> +PciAddress = Annotated[
> +    str, StringConstraints(pattern=r"^[\da-fA-F]{4}:[\da-fA-F]{2}:[\da-fA-F]{2}.\d:?\w*$")

We have a pattern for this in utils.py. We can reuse it and maybe update it 
if needed.

> +]
> +"""A constrained string type representing a PCI address."""

This should be above the var and I think regular comment (with #) should 
suffice.


> @@ -150,69 +159,53 @@ class PortConfig:

> +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
>   
> -@dataclass(slots=True, frozen=True)
> -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> -    """Scapy traffic generator specific configuration."""
>   
> -    pass
> +LogicalCores = Annotated[
> +    str,
> +    StringConstraints(pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"),
> +    Field(
> +        description="Comma-separated list of logical cores to use. "
> +        "An empty string means use all lcores.",
> +        examples=["1,2,3,4,5,18-22", "10-15"],
> +    ),
> +]

These two types don't have a docstring, but the others do.


> @@ -232,69 +225,25 @@ class NodeConfiguration:

>       arch: Architecture
>       os: OS

Adding the descriptions to all fields would be beneficial. Do we want to 
do that in this patch?


> @@ -313,10 +264,14 @@ class TGNodeConfiguration(NodeConfiguration):

> -    traffic_generator: TrafficGeneratorConfig
> +    traffic_generator: TrafficGeneratorConfigTypes
> +
>   
> +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
> +"""Union type for all the node configuration types."""

Same note as with PciAddress.


> @@ -405,31 +369,63 @@ class TestSuiteConfig:

> +    test_suite_name: str = Field(
> +        title="Test suite name",
> +        description="The identifying name of the test suite.",

I think we need to update this to mention that it's the test suite 
module name. Maybe we can also update the field, as it's only used in 
this object.

> +        alias="test_suite",
> +    )
> +    test_cases_names: list[str] = Field(
> +        default_factory=list,
> +        title="Test cases by name",
> +        description="The identifying name of the test cases of the test suite.",
> +        alias="test_cases",
> +    )

The attributes under Attributes need to be updated.

> +
> +    @cached_property
> +    def test_suite_spec(self) -> "TestSuiteSpec":
> +        """The specification of the requested test suite."""
> +        from framework.test_suite import find_by_name
> +
> +        test_suite_spec = find_by_name(self.test_suite_name)
> +        assert test_suite_spec is not None, f"{self.test_suite_name} is not a valid test suite name"

Doesn't end with a dot; the message should also mention that we're 
dealing with the module name.

> +        return test_suite_spec
> +
> +    @model_validator(mode="before")

I think it makes sense to exclude these from docs. I tried putting :meta 
private: into a docstring and it seems to be working.

>       @classmethod
> -    def from_dict(
> -        cls,
> -        entry: str | TestSuiteConfigDict,
> -    ) -> Self:
> -        """Create an instance from two different types.
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation into a valid mapping."""
> +        if isinstance(data, str):
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
> +

Why is this here? To unify the format with the one accepted by the 
--test-suite argument? Do we want to add an alternative format? If so, 
we need to make sure we document clearly that there are two alternatives 
and that they're equivalent.
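
For the record, the conversion itself is small; both spellings normalize 
to the same mapping before field validation runs (stdlib-only sketch of 
the validator body):

```python
def convert_from_string(data):
    """Turn 'TEST_SUITE [TEST_CASE, ...]' into the mapping form."""
    if isinstance(data, str):
        test_suite, *test_cases = data.split()
        return {"test_suite": test_suite, "test_cases": test_cases}
    return data


# The string form normalizes into the mapping form:
assert convert_from_string("hello_world hello_world_single_core") == {
    "test_suite": "hello_world",
    "test_cases": ["hello_world_single_core"],
}
# Mappings pass through untouched:
assert convert_from_string({"test_suite": "hello_world"}) == {"test_suite": "hello_world"}
```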

> +    @model_validator(mode="after")
> +    def validate_names(self) -> Self:
> +        """Validate the supplied test suite and test cases names."""

In Configuration.validate_test_runs_with_nodes() the docstring mentions 
the use of the cached property, let's also do that here.

> +        available_test_cases = map(lambda t: t.name, self.test_suite_spec.test_cases)
> +        for requested_test_case in self.test_cases_names:
> +            assert requested_test_case in available_test_cases, (
> +                f"{requested_test_case} is not a valid test case "
> +                f"for test suite {self.test_suite_name}"

for test suite -> of test suite; also end with a dot. The dot is missing 
in a lot of places (and capital letters where the message doesn't start 
with a var value).


> @@ -442,143 +438,132 @@ class TestRunConfiguration:

> -@dataclass(slots=True, frozen=True)
> +
> +@dataclass(frozen=True, kw_only=True)
>   class Configuration:
>       """DTS testbed and test configuration.
>   
> -    The node configuration is not stored in this object. Rather, all used node configurations
> -    are stored inside the test run configuration where the nodes are actually used.
> -

I think it makes sense to explain the extra validation (with the 
@*_validator decorators) that's being done in the docstring (if we 
remove the validation methods from the generated docs). The docstring 
should be updated for each model that does the extra validation.

>       Attributes:
>           test_runs: Test run configurations.
> +        nodes: Node configurations.
>       """
>   
> -    test_runs: list[TestRunConfiguration]
> +    test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
>   
> +    @field_validator("nodes")
>       @classmethod
> -    def from_dict(cls, d: ConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> +    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
> +        """Validate that the node names are unique."""
> +        nodes_by_name: dict[str, int] = {}
> +        for node_no, node in enumerate(nodes):
> +            assert node.name not in nodes_by_name, (
> +                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
> +                f"({node.name})"
> +            )
> +            nodes_by_name[node.name] = node_no
> +
> +        return nodes
> +
> +    @model_validator(mode="after")
> +    def validate_ports(self) -> Self:
> +        """Validate that the ports are all linked to valid ones."""
> +        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
> +            (node.name, port.pci): False for node in self.nodes for port in node.ports
> +        }
> +
> +        for node_no, node in enumerate(self.nodes):

I could see why we use enumeration for nodes in validate_node_names, 
but here we can just use node names in assert messages. At least that 
should be the case if nodes get validated before this model validator 
runs - is that the case?

> +            for port_no, port in enumerate(node.ports):
> +                peer_port_identifier = (port.peer_node, port.peer_pci)
> +                peer_port = port_links.get(peer_port_identifier, None)
> +                assert peer_port is not None, (
> +                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
> +                )
> +                assert peer_port is False, (
> +                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
> +                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
> +                )
> +                port_links[peer_port_identifier] = (node_no, port_no)
>   

> +    @cached_property
> +    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:

Let's move the property to be the first member of the class, to unify 
the order with TestSuiteConfig.

> +        """List test runs with the associated nodes."""

This doesn't list the test runs. I think the docstring should say a bit 
more to make it obvious that this is the main attribute to use with this 
class. Or maybe that could be in the class's docstring.

We're also missing the Returns: section.

> +        test_runs_with_nodes = []
>   
> -        Returns:
> -            The whole configuration instance.
> -        """
> -        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
> -            map(NodeConfiguration.from_dict, d["nodes"])
> -        )
> -        assert len(nodes) > 0, "There must be a node to test"
> +        for test_run_no, test_run in enumerate(self.test_runs):
> +            sut_node_name = test_run.system_under_test_node.node_name
> +            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)

There are a number of these instead of a list comprehension (I mentioned 
this in a previous patch). I still don't really see a reason to not use 
list comprehensions in all these cases.
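For example, with illustrative dict-based nodes, the two forms side by side:

```python
nodes = [{"name": "sut1"}, {"name": "tg1"}]
sut_node_name = "sut1"

# Form used in the patch:
via_filter = next(filter(lambda n: n["name"] == sut_node_name, nodes), None)

# Comprehension-based equivalent:
matching = [n for n in nodes if n["name"] == sut_node_name]
via_comprehension = matching[0] if matching else None
```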


> +
> +
> +ConfigurationType = TypeAdapter(Configuration)

This new transformed class exists only for validation purposes, right? I 
think we can move this to load_config, as it's not going to be used 
anywhere else.

Also I'd rename it to something else, it's not a type. Maybe 
ConfigurationAdapter or PydanticConfiguration or ConfigurationModel (as 
the adapter adds some methods from BaseModel). Or something else, but 
the type in the name confused me.
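A sketch of keeping the adapter local to the loader (the Configuration model here is a minimal stand-in, not the real one):

```python
from dataclasses import dataclass

from pydantic import TypeAdapter


@dataclass
class Configuration:
    # Stand-in for the real Configuration model.
    name: str


def load_config(data: dict) -> Configuration:
    # The adapter exists only for validation, so it can live here
    # instead of as a module-level "ConfigurationType".
    return TypeAdapter(Configuration).validate_python(data)
```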

> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> @@ -231,10 +234,10 @@ def _get_test_suites_with_cases(
>           test_suites_with_cases = []
>   
>           for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
> +            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)

We've already done all the validation and importing at this point and we 
should be able to use test_suite_config.test_suite_spec, right? The same 
is true for TestSuiteWithCases, which holds the same information.

Looks like you removed _get_test_suite_class in a subsequent patch, but 
we should think about getting rid of TestSuiteWithCases, as it was 
conceived to do what TestSuiteSpec is doing.


^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-09-17 11:39   ` Juraj Linkeš
  0 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:39 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek


> diff --git a/dts/framework/runner.py b/dts/framework/runner.py

> @@ -229,139 +221,34 @@ def _get_test_suites_with_cases(

> +            filtered_test_cases: list[TestCase] = [
> +                test_case
> +                for test_case in test_suite_spec.test_cases
> +                if not test_suite_config.test_cases_names
> +                or test_case.name in test_suite_config.test_cases_names
> +            ]

Ah, looks like TestSuiteSpec doesn't contain the subset we want to test. 
Could we adapt it this way? I think we don't really care about test 
cases we don't want to test.
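Something along these lines, as a hypothetical helper on the spec (names assumed; test cases reduced to plain strings for illustration):

```python
def filter_test_cases(all_cases, requested_names):
    """Keep only the requested test cases; an empty request keeps all."""
    if not requested_names:
        return list(all_cases)
    return [case for case in all_cases if case in requested_names]
```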


^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 5/5] dts: add JSON schema generation script
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
@ 2024-09-17 11:59   ` Juraj Linkeš
  0 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:59 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek


>   create mode 100755 dts/generate-schema.py

Could it be worth putting this into devtools? It is a devtool.

> 
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst

> @@ -430,6 +430,16 @@ Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
>   Configuration Schema
>   --------------------
>   
> +The configuration schema is automatically generated from Pydantic models and can be found
> +at ``dts/framework/config/conf_yaml_schema.json``. Whenever the models are changed, the schema
> +should be regenerated using the dedicated script at ``dts/generate-schema.py``, e.g.:

Should we add this to devtools/dts-check-format.sh? Looks like a good 
candidate.
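A generic sketch of such a check (the helper name and how it would be wired into dts-check-format.sh are assumptions):

```shell
# Fail if regenerating FILE with the given command changes its contents.
check_unchanged() {
    file="$1"; shift
    before=$(cat "$file")
    "$@"                    # command that regenerates "$file" in place
    after=$(cat "$file")
    [ "$before" = "$after" ]
}
```

The script could then run `check_unchanged framework/config/conf_yaml_schema.json ./generate-schema.py` and report a stale schema.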

> +
> +.. code-block:: console
> +
> +   $ poetry shell
> +   (dts-py3.10) $ ./generate-schema.py
> +
> +
>   Definitions
>   ~~~~~~~~~~~

The definition names have changed, and maybe there are some other 
changes as well - or does that not matter? Can these Pydantic changes 
help us generate this schema description as well?

> diff --git a/dts/generate-schema.py b/dts/generate-schema.py

> @@ -0,0 +1,38 @@
> +#!/usr/bin/env python3
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2024 Arm Limited
> +
> +"""JSON schema generation script."""

This should at least say how to run the script, but we probably want to 
add more, such as what it creates the schema from and where it's going 
to put it.


> +from framework.config import ConfigurationType
> +

Ah, so it is used elsewhere. Let's just rename it then.

> +DTS_DIR = os.path.dirname(os.path.realpath(__file__))
> +RELATIVE_PATH_TO_SCHEMA = "framework/config/conf_yaml_schema.json"

We're using pathlib everywhere in DTS, so let's use it here as well. Not 
sure if the portability is needed in this script, but why not.
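i.e. something like (illustrative pathlib equivalents of the os.path-based constants):

```python
from pathlib import Path

# pathlib equivalents of the os.path-based constants:
DTS_DIR = Path(__file__).resolve().parent
SCHEMA_PATH = DTS_DIR / "framework" / "config" / "conf_yaml_schema.json"
```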


> +class GenerateSchemaWithDialect(GenerateJsonSchema):
> +    """Custom schema generator which adds the schema dialect."""

I'd add that we're adding a reference to the schema dialect.


> +    print("Schema generated successfully!")
> +except Exception as e:
> +    raise Exception("failed to generate schema") from e

Let's unify the message with the print above by capitalizing and adding 
a dot to the end.


^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
  2024-09-16 13:17   ` Juraj Linkeš
@ 2024-09-19 19:56   ` Nicholas Pratte
  1 sibling, 0 replies; 13+ messages in thread
From: Nicholas Pratte @ 2024-09-19 19:56 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

I understand Juraj's concern about the dependencies, assuming this
change is to stay as its own isolated patch in the log. That aside,
this is pretty straightforward, and I have confidence in the judgement
of whatever decision is made between the two of you.

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Add Pydantic to the project dependencies while dropping Warlock.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/poetry.lock    | 346 +++++++++++++++++----------------------------
>  dts/pyproject.toml |   3 +-
>  2 files changed, 135 insertions(+), 214 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 5f8fa03933..c5b0d059a8 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,23 +1,16 @@
>  # This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
>
>  [[package]]
> -name = "attrs"
> -version = "23.1.0"
> -description = "Classes Without Boilerplate"
> +name = "annotated-types"
> +version = "0.7.0"
> +description = "Reusable constraint types to use with typing.Annotated"
>  optional = false
> -python-versions = ">=3.7"
> +python-versions = ">=3.8"
>  files = [
> -    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
> -    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
> +    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
> +    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
>  ]
>
> -[package.extras]
> -cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
> -dev = ["attrs[docs,tests]", "pre-commit"]
> -docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
> -tests = ["attrs[tests-no-zope]", "zope-interface"]
> -tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
> -
>  [[package]]
>  name = "bcrypt"
>  version = "4.0.1"
> @@ -280,66 +273,6 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
>  plugins = ["setuptools"]
>  requirements-deprecated-finder = ["pip-api", "pipreqs"]
>
> -[[package]]
> -name = "jsonpatch"
> -version = "1.33"
> -description = "Apply JSON-Patches (RFC 6902)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
> -    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
> -]
> -
> -[package.dependencies]
> -jsonpointer = ">=1.9"
> -
> -[[package]]
> -name = "jsonpointer"
> -version = "2.4"
> -description = "Identify specific nodes in a JSON document (RFC 6901)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
> -    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
> -]
> -
> -[[package]]
> -name = "jsonschema"
> -version = "4.18.4"
> -description = "An implementation of JSON Schema validation for Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
> -    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -jsonschema-specifications = ">=2023.03.6"
> -referencing = ">=0.28.4"
> -rpds-py = ">=0.7.1"
> -
> -[package.extras]
> -format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
> -format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
> -
> -[[package]]
> -name = "jsonschema-specifications"
> -version = "2023.7.1"
> -description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
> -    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
> -]
> -
> -[package.dependencies]
> -referencing = ">=0.28.0"
> -
>  [[package]]
>  name = "mccabe"
>  version = "0.7.0"
> @@ -492,6 +425,129 @@ files = [
>      {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
>  ]
>
> +[[package]]
> +name = "pydantic"
> +version = "2.8.2"
> +description = "Data validation using Python type hints"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic-2.8.2-py3-none-any.whl", hash = "sha256:73ee9fddd406dc318b885c7a2eab8a6472b68b8fb5ba8150949fc3db939f23c8"},
> +    {file = "pydantic-2.8.2.tar.gz", hash = "sha256:6f62c13d067b0755ad1c21a34bdd06c0c12625a22b0fc09c6b149816604f7c2a"},
> +]
> +
> +[package.dependencies]
> +annotated-types = ">=0.4.0"
> +pydantic-core = "2.20.1"
> +typing-extensions = [
> +    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
> +    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
> +]
> +
> +[package.extras]
> +email = ["email-validator (>=2.0.0)"]
> +
> +[[package]]
> +name = "pydantic-core"
> +version = "2.20.1"
> +description = "Core functionality for Pydantic validation and serialization"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3acae97ffd19bf091c72df4d726d552c473f3576409b2a7ca36b2f535ffff4a3"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:41f4c96227a67a013e7de5ff8f20fb496ce573893b7f4f2707d065907bffdbd6"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f239eb799a2081495ea659d8d4a43a8f42cd1fe9ff2e7e436295c38a10c286a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53e431da3fc53360db73eedf6f7124d1076e1b4ee4276b36fb25514544ceb4a3"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1f62b2413c3a0e846c3b838b2ecd6c7a19ec6793b2a522745b0869e37ab5bc1"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d41e6daee2813ecceea8eda38062d69e280b39df793f5a942fa515b8ed67953"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d482efec8b7dc6bfaedc0f166b2ce349df0011f5d2f1f25537ced4cfc34fd98"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e93e1a4b4b33daed65d781a57a522ff153dcf748dee70b40c7258c5861e1768a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e7c4ea22b6739b162c9ecaaa41d718dfad48a244909fe7ef4b54c0b530effc5a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4f2790949cf385d985a31984907fecb3896999329103df4e4983a4a41e13e840"},
> +    {file = "pydantic_core-2.20.1-cp310-none-win32.whl", hash = "sha256:5e999ba8dd90e93d57410c5e67ebb67ffcaadcea0ad973240fdfd3a135506250"},
> +    {file = "pydantic_core-2.20.1-cp310-none-win_amd64.whl", hash = "sha256:512ecfbefef6dac7bc5eaaf46177b2de58cdf7acac8793fe033b24ece0b9566c"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:d2a8fa9d6d6f891f3deec72f5cc668e6f66b188ab14bb1ab52422fe8e644f312"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:175873691124f3d0da55aeea1d90660a6ea7a3cfea137c38afa0a5ffabe37b88"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:37eee5b638f0e0dcd18d21f59b679686bbd18917b87db0193ae36f9c23c355fc"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:25e9185e2d06c16ee438ed39bf62935ec436474a6ac4f9358524220f1b236e43"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:150906b40ff188a3260cbee25380e7494ee85048584998c1e66df0c7a11c17a6"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ad4aeb3e9a97286573c03df758fc7627aecdd02f1da04516a86dc159bf70121"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d3f3ed29cd9f978c604708511a1f9c2fdcb6c38b9aae36a51905b8811ee5cbf1"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0dae11d8f5ded51699c74d9548dcc5938e0804cc8298ec0aa0da95c21fff57b"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:faa6b09ee09433b87992fb5a2859efd1c264ddc37280d2dd5db502126d0e7f27"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9dc1b507c12eb0481d071f3c1808f0529ad41dc415d0ca11f7ebfc666e66a18b"},
> +    {file = "pydantic_core-2.20.1-cp311-none-win32.whl", hash = "sha256:fa2fddcb7107e0d1808086ca306dcade7df60a13a6c347a7acf1ec139aa6789a"},
> +    {file = "pydantic_core-2.20.1-cp311-none-win_amd64.whl", hash = "sha256:40a783fb7ee353c50bd3853e626f15677ea527ae556429453685ae32280c19c2"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:595ba5be69b35777474fa07f80fc260ea71255656191adb22a8c53aba4479231"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a4f55095ad087474999ee28d3398bae183a66be4823f753cd7d67dd0153427c9"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f9aa05d09ecf4c75157197f27cdc9cfaeb7c5f15021c6373932bf3e124af029f"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e97fdf088d4b31ff4ba35db26d9cc472ac7ef4a2ff2badeabf8d727b3377fc52"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc633a9fe1eb87e250b5c57d389cf28998e4292336926b0b6cdaee353f89a237"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d573faf8eb7e6b1cbbcb4f5b247c60ca8be39fe2c674495df0eb4318303137fe"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:26dc97754b57d2fd00ac2b24dfa341abffc380b823211994c4efac7f13b9e90e"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:33499e85e739a4b60c9dac710c20a08dc73cb3240c9a0e22325e671b27b70d24"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:bebb4d6715c814597f85297c332297c6ce81e29436125ca59d1159b07f423eb1"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:516d9227919612425c8ef1c9b869bbbee249bc91912c8aaffb66116c0b447ebd"},
> +    {file = "pydantic_core-2.20.1-cp312-none-win32.whl", hash = "sha256:469f29f9093c9d834432034d33f5fe45699e664f12a13bf38c04967ce233d688"},
> +    {file = "pydantic_core-2.20.1-cp312-none-win_amd64.whl", hash = "sha256:035ede2e16da7281041f0e626459bcae33ed998cca6a0a007a5ebb73414ac72d"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:0827505a5c87e8aa285dc31e9ec7f4a17c81a813d45f70b1d9164e03a813a686"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:19c0fa39fa154e7e0b7f82f88ef85faa2a4c23cc65aae2f5aea625e3c13c735a"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa223cd1e36b642092c326d694d8bf59b71ddddc94cdb752bbbb1c5c91d833b"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c336a6d235522a62fef872c6295a42ecb0c4e1d0f1a3e500fe949415761b8a19"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7eb6a0587eded33aeefea9f916899d42b1799b7b14b8f8ff2753c0ac1741edac"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:70c8daf4faca8da5a6d655f9af86faf6ec2e1768f4b8b9d0226c02f3d6209703"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9fa4c9bf273ca41f940bceb86922a7667cd5bf90e95dbb157cbb8441008482c"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:11b71d67b4725e7e2a9f6e9c0ac1239bbc0c48cce3dc59f98635efc57d6dac83"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:270755f15174fb983890c49881e93f8f1b80f0b5e3a3cc1394a255706cabd203"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c81131869240e3e568916ef4c307f8b99583efaa60a8112ef27a366eefba8ef0"},
> +    {file = "pydantic_core-2.20.1-cp313-none-win32.whl", hash = "sha256:b91ced227c41aa29c672814f50dbb05ec93536abf8f43cd14ec9521ea09afe4e"},
> +    {file = "pydantic_core-2.20.1-cp313-none-win_amd64.whl", hash = "sha256:65db0f2eefcaad1a3950f498aabb4875c8890438bc80b19362cf633b87a8ab20"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:4745f4ac52cc6686390c40eaa01d48b18997cb130833154801a442323cc78f91"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8ad4c766d3f33ba8fd692f9aa297c9058970530a32c728a2c4bfd2616d3358b"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41e81317dd6a0127cabce83c0c9c3fbecceae981c8391e6f1dec88a77c8a569a"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:04024d270cf63f586ad41fff13fde4311c4fc13ea74676962c876d9577bcc78f"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eaad4ff2de1c3823fddf82f41121bdf453d922e9a238642b1dedb33c4e4f98ad"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:26ab812fa0c845df815e506be30337e2df27e88399b985d0bb4e3ecfe72df31c"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c5ebac750d9d5f2706654c638c041635c385596caf68f81342011ddfa1e5598"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2aafc5a503855ea5885559eae883978c9b6d8c8993d67766ee73d82e841300dd"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4868f6bd7c9d98904b748a2653031fc9c2f85b6237009d475b1008bfaeb0a5aa"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:aa2f457b4af386254372dfa78a2eda2563680d982422641a85f271c859df1987"},
> +    {file = "pydantic_core-2.20.1-cp38-none-win32.whl", hash = "sha256:225b67a1f6d602de0ce7f6c1c3ae89a4aa25d3de9be857999e9124f15dab486a"},
> +    {file = "pydantic_core-2.20.1-cp38-none-win_amd64.whl", hash = "sha256:6b507132dcfc0dea440cce23ee2182c0ce7aba7054576efc65634f080dbe9434"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b03f7941783b4c4a26051846dea594628b38f6940a2fdc0df00b221aed39314c"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1eedfeb6089ed3fad42e81a67755846ad4dcc14d73698c120a82e4ccf0f1f9f6"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:635fee4e041ab9c479e31edda27fcf966ea9614fff1317e280d99eb3e5ab6fe2"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:77bf3ac639c1ff567ae3b47f8d4cc3dc20f9966a2a6dd2311dcc055d3d04fb8a"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ed1b0132f24beeec5a78b67d9388656d03e6a7c837394f99257e2d55b461611"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6514f963b023aeee506678a1cf821fe31159b925c4b76fe2afa94cc70b3222b"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10d4204d8ca33146e761c79f83cc861df20e7ae9f6487ca290a97702daf56006"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2d036c7187b9422ae5b262badb87a20a49eb6c5238b2004e96d4da1231badef1"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9ebfef07dbe1d93efb94b4700f2d278494e9162565a54f124c404a5656d7ff09"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:6b9d9bb600328a1ce523ab4f454859e9d439150abb0906c5a1983c146580ebab"},
> +    {file = "pydantic_core-2.20.1-cp39-none-win32.whl", hash = "sha256:784c1214cb6dd1e3b15dd8b91b9a53852aed16671cc3fbe4786f4f1db07089e2"},
> +    {file = "pydantic_core-2.20.1-cp39-none-win_amd64.whl", hash = "sha256:d2fe69c5434391727efa54b47a1e7986bb0186e72a41b203df8f5b0a19a4f669"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a45f84b09ac9c3d35dfcf6a27fd0634d30d183205230a0ebe8373a0e8cfa0906"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d02a72df14dfdbaf228424573a07af10637bd490f0901cee872c4f434a735b94"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2b27e6af28f07e2f195552b37d7d66b150adbaa39a6d327766ffd695799780f"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:084659fac3c83fd674596612aeff6041a18402f1e1bc19ca39e417d554468482"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:242b8feb3c493ab78be289c034a1f659e8826e2233786e36f2893a950a719bb6"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:38cf1c40a921d05c5edc61a785c0ddb4bed67827069f535d794ce6bcded919fc"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e0bbdd76ce9aa5d4209d65f2b27fc6e5ef1312ae6c5333c26db3f5ade53a1e99"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:254ec27fdb5b1ee60684f91683be95e5133c994cc54e86a0b0963afa25c8f8a6"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:407653af5617f0757261ae249d3fba09504d7a71ab36ac057c938572d1bc9331"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:c693e916709c2465b02ca0ad7b387c4f8423d1db7b4649c551f27a529181c5ad"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b5ff4911aea936a47d9376fd3ab17e970cc543d1b68921886e7f64bd28308d1"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:177f55a886d74f1808763976ac4efd29b7ed15c69f4d838bbd74d9d09cf6fa86"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:964faa8a861d2664f0c7ab0c181af0bea66098b1919439815ca8803ef136fc4e"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:4dd484681c15e6b9a977c785a345d3e378d72678fd5f1f3c0509608da24f2ac0"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f6d6cff3538391e8486a431569b77921adfcdef14eb18fbf19b7c0a5294d4e6a"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:a6d511cc297ff0883bc3708b465ff82d7560193169a8b93260f74ecb0a5e08a7"},
> +    {file = "pydantic_core-2.20.1.tar.gz", hash = "sha256:26ca695eeee5f9f1aeeb211ffc12f10bcb6f71e2989988fda61dabd65db878d4"},
> +]
> +
> +[package.dependencies]
> +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
> +
>  [[package]]
>  name = "pydocstyle"
>  version = "6.1.1"
> @@ -633,127 +689,6 @@ files = [
>      {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
>  ]
>
> -[[package]]
> -name = "referencing"
> -version = "0.30.0"
> -description = "JSON Referencing + Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
> -    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -rpds-py = ">=0.7.0"
> -
> -[[package]]
> -name = "rpds-py"
> -version = "0.9.2"
> -description = "Python bindings to Rust's persistent data structures (rpds)"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
> -    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
> -    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
> -    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
> -    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
> -    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
> -    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
> -    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
> -    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
> -    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
> -]
> -
>  [[package]]
>  name = "scapy"
>  version = "2.5.0"
> @@ -826,31 +761,16 @@ files = [
>
>  [[package]]
>  name = "typing-extensions"
> -version = "4.11.0"
> +version = "4.12.2"
>  description = "Backported and Experimental Type Hints for Python 3.8+"
>  optional = false
>  python-versions = ">=3.8"
>  files = [
> -    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
> -    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
> +    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
> +    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
>  ]
>
> -[[package]]
> -name = "warlock"
> -version = "2.0.1"
> -description = "Python object model built on JSON schema and JSON patch."
> -optional = false
> -python-versions = ">=3.7,<4.0"
> -files = [
> -    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
> -    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
> -]
> -
> -[package.dependencies]
> -jsonpatch = ">=1,<2"
> -jsonschema = ">=4,<5"
> -
>  [metadata]
>  lock-version = "2.0"
>  python-versions = "^3.10"
> -content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
> +content-hash = "f69ffb8c1545d7beb035533dab109722f844f39f9ffd46b7aceb386e90fa039d"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 0b9b09805a..e5785f27d8 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -19,13 +19,13 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
>
>  [tool.poetry.dependencies]
>  python = "^3.10"
> -warlock = "^2.0.1"
>  PyYAML = "^6.0"
>  types-PyYAML = "^6.0.8"
>  fabric = "^2.7.1"
>  scapy = "^2.5.0"
>  pydocstyle = "6.1.1"
>  typing-extensions = "^4.11.0"
> +pydantic = "^2.8.2"
>
>  [tool.poetry.group.dev.dependencies]
>  mypy = "^1.10.0"
> @@ -55,6 +55,7 @@ python_version = "3.10"
>  enable_error_code = ["ignore-without-code"]
>  show_error_codes = true
>  warn_unused_ignores = true
> +plugins = "pydantic.mypy"
>
>  [tool.isort]
>  profile = "black"
> --
> 2.34.1
>
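
For context, the Warlock-to-Pydantic move this series describes replaces JSON-schema-validated dicts with typed models that validate eagerly and can emit their own JSON schema (which is what a generation script like the one in patch 5/5 builds on). A minimal sketch of that pattern follows; the model name and fields are hypothetical and are not taken from the actual DTS configuration:

```python
# Minimal sketch of a Pydantic v2 configuration model. NodeConfig and its
# fields are illustrative only; they do not mirror the DTS schema.
from pydantic import BaseModel, ValidationError


class NodeConfig(BaseModel):
    """A typed configuration entry, validated on construction."""

    name: str
    hostname: str
    arch: str = "x86_64"  # defaults replace schema-level default handling


# Validation happens up front when the raw mapping is parsed.
node = NodeConfig.model_validate({"name": "sut1", "hostname": "10.0.0.1"})

# Pydantic can also derive a JSON schema from the model, so a standalone
# hand-maintained conf_yaml_schema.json is no longer needed.
schema = NodeConfig.model_json_schema()
```

A missing required field (e.g. `hostname`) raises `ValidationError` at parse time rather than surfacing later as a schema mismatch.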

-- 

*Let's Connect!*

*October Webinars*

Ask Us Anything: IOL Services Open Q&A
<https://unh.zoom.us/webinar/register/9017265932716/WN_OUo5S7iQRLmKKY7CsmwZhw#/registration>
Your questions. Our answers. Let's get started. (Oct 3rd)

Live Tour of INTACT® for IPv6 Testing and Validation
<https://unh.zoom.us/webinar/register/7117231236474/WN_I2zfyi_2S2yEiXkxBRi8sA#/registration>
Open tour. Open Q&A. See why we think you'll love INTACT. (Oct 9th)

How to Prep for Our NVMe® Plugfest #21
<https://unh.zoom.us/webinar/register/4017266809553/WN_X1iA2SZ8QhmcGboF2DImNg#/registration>
Checklists. Conversation. Let's get ready to plug in! (Oct 15th)

*Newsletter*

Get the IOL Connector
<https://www.iol.unh.edu/news/email-newsletters> for our latest news and
event info.

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
  2024-09-16 13:00   ` Juraj Linkeš
@ 2024-09-19 20:01   ` Nicholas Pratte
  1 sibling, 0 replies; 13+ messages in thread
From: Nicholas Pratte @ 2024-09-19 20:01 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

I think Juraj's comments here make sense; it would probably be best to
separate this in conjunction with Juraj's decorator patch and use that
patch as a dependency. From what I can understand, the changes offered
here look good to me.

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>


^ permalink raw reply	[flat|nested] 13+ messages in thread

end of thread, other threads:[~2024-09-19 20:01 UTC | newest]

Thread overview: 13+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-09-16 13:00   ` Juraj Linkeš
2024-09-19 20:01   ` Nicholas Pratte
2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
2024-09-16 13:17   ` Juraj Linkeš
2024-09-19 19:56   ` Nicholas Pratte
2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
2024-09-17 11:13   ` Juraj Linkeš
2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-09-17 11:39   ` Juraj Linkeš
2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
2024-09-17 11:59   ` Juraj Linkeš

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).