DPDK patches and discussions
* [PATCH 0/5] dts: Pydantic configuration
@ 2024-08-22 16:39 Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                   ` (9 more replies)
  0 siblings, 10 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev; +Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro

Hello,

sending the first version of the Pydantic configuration update work.

Best,
Luca

Luca Vizzarro (5):
  dts: add TestSuiteSpec class and discovery
  dts: add Pydantic and remove Warlock
  dts: use Pydantic in the configuration
  dts: use TestSuiteSpec class imports
  dts: add JSON schema generation script

 doc/guides/tools/dts.rst                      |  10 +
 dts/framework/config/__init__.py              | 588 +++++++------
 dts/framework/config/conf_yaml_schema.json    | 776 ++++++++++--------
 dts/framework/config/types.py                 | 132 ---
 dts/framework/runner.py                       | 198 ++---
 dts/framework/settings.py                     |  16 +-
 dts/framework/test_suite.py                   | 182 +++-
 dts/framework/testbed_model/sut_node.py       |   2 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/generate-schema.py                        |  38 +
 dts/poetry.lock                               | 346 +++-----
 dts/pyproject.toml                            |   3 +-
 13 files changed, 1152 insertions(+), 1145 deletions(-)
 delete mode 100644 dts/framework/config/types.py
 create mode 100755 dts/generate-schema.py

-- 
2.34.1



* [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-16 13:00   ` Juraj Linkeš
  2024-09-19 20:01   ` Nicholas Pratte
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
                   ` (8 subsequent siblings)
  9 siblings, 2 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Currently there is no definition that identifies all the test suites
available for testing. This change simplifies the process of discovering
and identifying all the available test suites.
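
A rough usage sketch of the discovery API added below, assuming a
"tests" package containing modules named TestSuite_* on the import path
(here "hello_world" stands in for one such module):

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # Every TestSuite_*.py module under the "tests" package is discovered
    # and imported once, when framework.test_suite is first imported.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.name, spec.class_name)

    # Look up a suite by its short name, i.e. the module name without the
    # "TestSuite_" prefix; returns None when no such suite exists.
    spec = find_by_name("hello_world")
    if spec is not None:
        cases = [case.name for case in spec.test_cases]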

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/test_suite.py | 182 +++++++++++++++++++++++++++++++++++-
 1 file changed, 181 insertions(+), 1 deletion(-)

diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 694b2eba65..972968b036 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -13,12 +14,22 @@
     * Test case verification.
 """
 
+import inspect
+import re
+from dataclasses import dataclass
+from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
-from typing import ClassVar, Union
+from pkgutil import iter_modules
+from types import FunctionType, ModuleType
+from typing import ClassVar, NamedTuple, Union
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.port import Port, PortLink
 from framework.testbed_model.sut_node import SutNode
@@ -365,3 +376,172 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
         if received_packet.src != expected_packet.src or received_packet.dst != expected_packet.dst:
             return False
         return True
+
+
+class TestCaseVariant(Enum):
+    """Enum representing the variant of the test case."""
+
+    #:
+    FUNCTIONAL = auto()
+    #:
+    PERFORMANCE = auto()
+
+
+class TestCase(NamedTuple):
+    """Tuple representing a test case."""
+
+    #: The name of the test case without prefix
+    name: str
+    #: The reference to the function
+    function_type: FunctionType
+    #: The test case variant
+    variant: TestCaseVariant
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper class method
+    :meth:`discover_all` is provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module_type(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_type(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            `obj` must be a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_type in inspect.getmembers(self.module_type, is_test_suite):
+            if class_name == self.class_name:
+                return class_type
+
+        raise Exception("class not found in eligible test module")
+
+    @cached_property
+    def test_cases(self) -> list[TestCase]:
+        """A list of all the available test cases."""
+        test_cases = []
+
+        functions = inspect.getmembers(self.class_type, inspect.isfunction)
+        for fn_name, fn_type in functions:
+            if prefix := re.match(self.FUNC_TEST_CASE_REGEX, fn_name):
+                variant = TestCaseVariant.FUNCTIONAL
+            elif prefix := re.match(self.PERF_TEST_CASE_REGEX, fn_name):
+                variant = TestCaseVariant.PERFORMANCE
+            else:
+                continue
+
+            name = fn_name[len(prefix.group(0)) :]
+            test_cases.append(TestCase(name, fn_type, variant))
+
+        return test_cases
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated by underscores.
+        For each module found, a :class:`TestSuite` class is looked up, named starting with
+        `TEST_SUITE_CLASS_PREFIX` and continuing with the module name in PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package in which to find the test suites. If not set,
+                the constant :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used instead.
+            module_prefix: The name prefix defining the test suite module. If not set, the
+                constant :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` is used instead.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_type:
+                    test_suites.append(test_suite)
+            except Exception:
+                pass
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing them more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
-- 
2.34.1



* [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-16 13:17   ` Juraj Linkeš
                     ` (2 more replies)
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
                   ` (7 subsequent siblings)
  9 siblings, 3 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Add Pydantic to the project dependencies while dropping Warlock.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 346 +++++++++++++++++----------------------------
 dts/pyproject.toml |   3 +-
 2 files changed, 135 insertions(+), 214 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..c5b0d059a8 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,23 +1,16 @@
 # This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
 
 [[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
 optional = false
-python-versions = ">=3.7"
+python-versions = ">=3.8"
 files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "bcrypt"
 version = "4.0.1"
@@ -280,66 +273,6 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
 plugins = ["setuptools"]
 requirements-deprecated-finder = ["pip-api", "pipreqs"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -492,6 +425,129 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.8.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.8.2-py3-none-any.whl", hash = "sha256:73ee9fddd406dc318b885c7a2eab8a6472b68b8fb5ba8150949fc3db939f23c8"},
+    {file = "pydantic-2.8.2.tar.gz", hash = "sha256:6f62c13d067b0755ad1c21a34bdd06c0c12625a22b0fc09c6b149816604f7c2a"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.4.0"
+pydantic-core = "2.20.1"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.20.1"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3acae97ffd19bf091c72df4d726d552c473f3576409b2a7ca36b2f535ffff4a3"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:41f4c96227a67a013e7de5ff8f20fb496ce573893b7f4f2707d065907bffdbd6"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f239eb799a2081495ea659d8d4a43a8f42cd1fe9ff2e7e436295c38a10c286a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53e431da3fc53360db73eedf6f7124d1076e1b4ee4276b36fb25514544ceb4a3"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1f62b2413c3a0e846c3b838b2ecd6c7a19ec6793b2a522745b0869e37ab5bc1"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d41e6daee2813ecceea8eda38062d69e280b39df793f5a942fa515b8ed67953"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d482efec8b7dc6bfaedc0f166b2ce349df0011f5d2f1f25537ced4cfc34fd98"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e93e1a4b4b33daed65d781a57a522ff153dcf748dee70b40c7258c5861e1768a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e7c4ea22b6739b162c9ecaaa41d718dfad48a244909fe7ef4b54c0b530effc5a"},
+    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4f2790949cf385d985a31984907fecb3896999329103df4e4983a4a41e13e840"},
+    {file = "pydantic_core-2.20.1-cp310-none-win32.whl", hash = "sha256:5e999ba8dd90e93d57410c5e67ebb67ffcaadcea0ad973240fdfd3a135506250"},
+    {file = "pydantic_core-2.20.1-cp310-none-win_amd64.whl", hash = "sha256:512ecfbefef6dac7bc5eaaf46177b2de58cdf7acac8793fe033b24ece0b9566c"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:d2a8fa9d6d6f891f3deec72f5cc668e6f66b188ab14bb1ab52422fe8e644f312"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:175873691124f3d0da55aeea1d90660a6ea7a3cfea137c38afa0a5ffabe37b88"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:37eee5b638f0e0dcd18d21f59b679686bbd18917b87db0193ae36f9c23c355fc"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:25e9185e2d06c16ee438ed39bf62935ec436474a6ac4f9358524220f1b236e43"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:150906b40ff188a3260cbee25380e7494ee85048584998c1e66df0c7a11c17a6"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ad4aeb3e9a97286573c03df758fc7627aecdd02f1da04516a86dc159bf70121"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d3f3ed29cd9f978c604708511a1f9c2fdcb6c38b9aae36a51905b8811ee5cbf1"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0dae11d8f5ded51699c74d9548dcc5938e0804cc8298ec0aa0da95c21fff57b"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:faa6b09ee09433b87992fb5a2859efd1c264ddc37280d2dd5db502126d0e7f27"},
+    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9dc1b507c12eb0481d071f3c1808f0529ad41dc415d0ca11f7ebfc666e66a18b"},
+    {file = "pydantic_core-2.20.1-cp311-none-win32.whl", hash = "sha256:fa2fddcb7107e0d1808086ca306dcade7df60a13a6c347a7acf1ec139aa6789a"},
+    {file = "pydantic_core-2.20.1-cp311-none-win_amd64.whl", hash = "sha256:40a783fb7ee353c50bd3853e626f15677ea527ae556429453685ae32280c19c2"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:595ba5be69b35777474fa07f80fc260ea71255656191adb22a8c53aba4479231"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a4f55095ad087474999ee28d3398bae183a66be4823f753cd7d67dd0153427c9"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f9aa05d09ecf4c75157197f27cdc9cfaeb7c5f15021c6373932bf3e124af029f"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e97fdf088d4b31ff4ba35db26d9cc472ac7ef4a2ff2badeabf8d727b3377fc52"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc633a9fe1eb87e250b5c57d389cf28998e4292336926b0b6cdaee353f89a237"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d573faf8eb7e6b1cbbcb4f5b247c60ca8be39fe2c674495df0eb4318303137fe"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:26dc97754b57d2fd00ac2b24dfa341abffc380b823211994c4efac7f13b9e90e"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:33499e85e739a4b60c9dac710c20a08dc73cb3240c9a0e22325e671b27b70d24"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:bebb4d6715c814597f85297c332297c6ce81e29436125ca59d1159b07f423eb1"},
+    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:516d9227919612425c8ef1c9b869bbbee249bc91912c8aaffb66116c0b447ebd"},
+    {file = "pydantic_core-2.20.1-cp312-none-win32.whl", hash = "sha256:469f29f9093c9d834432034d33f5fe45699e664f12a13bf38c04967ce233d688"},
+    {file = "pydantic_core-2.20.1-cp312-none-win_amd64.whl", hash = "sha256:035ede2e16da7281041f0e626459bcae33ed998cca6a0a007a5ebb73414ac72d"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:0827505a5c87e8aa285dc31e9ec7f4a17c81a813d45f70b1d9164e03a813a686"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:19c0fa39fa154e7e0b7f82f88ef85faa2a4c23cc65aae2f5aea625e3c13c735a"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa223cd1e36b642092c326d694d8bf59b71ddddc94cdb752bbbb1c5c91d833b"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c336a6d235522a62fef872c6295a42ecb0c4e1d0f1a3e500fe949415761b8a19"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7eb6a0587eded33aeefea9f916899d42b1799b7b14b8f8ff2753c0ac1741edac"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:70c8daf4faca8da5a6d655f9af86faf6ec2e1768f4b8b9d0226c02f3d6209703"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9fa4c9bf273ca41f940bceb86922a7667cd5bf90e95dbb157cbb8441008482c"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:11b71d67b4725e7e2a9f6e9c0ac1239bbc0c48cce3dc59f98635efc57d6dac83"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:270755f15174fb983890c49881e93f8f1b80f0b5e3a3cc1394a255706cabd203"},
+    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c81131869240e3e568916ef4c307f8b99583efaa60a8112ef27a366eefba8ef0"},
+    {file = "pydantic_core-2.20.1-cp313-none-win32.whl", hash = "sha256:b91ced227c41aa29c672814f50dbb05ec93536abf8f43cd14ec9521ea09afe4e"},
+    {file = "pydantic_core-2.20.1-cp313-none-win_amd64.whl", hash = "sha256:65db0f2eefcaad1a3950f498aabb4875c8890438bc80b19362cf633b87a8ab20"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:4745f4ac52cc6686390c40eaa01d48b18997cb130833154801a442323cc78f91"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8ad4c766d3f33ba8fd692f9aa297c9058970530a32c728a2c4bfd2616d3358b"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41e81317dd6a0127cabce83c0c9c3fbecceae981c8391e6f1dec88a77c8a569a"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:04024d270cf63f586ad41fff13fde4311c4fc13ea74676962c876d9577bcc78f"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eaad4ff2de1c3823fddf82f41121bdf453d922e9a238642b1dedb33c4e4f98ad"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:26ab812fa0c845df815e506be30337e2df27e88399b985d0bb4e3ecfe72df31c"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c5ebac750d9d5f2706654c638c041635c385596caf68f81342011ddfa1e5598"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2aafc5a503855ea5885559eae883978c9b6d8c8993d67766ee73d82e841300dd"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4868f6bd7c9d98904b748a2653031fc9c2f85b6237009d475b1008bfaeb0a5aa"},
+    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:aa2f457b4af386254372dfa78a2eda2563680d982422641a85f271c859df1987"},
+    {file = "pydantic_core-2.20.1-cp38-none-win32.whl", hash = "sha256:225b67a1f6d602de0ce7f6c1c3ae89a4aa25d3de9be857999e9124f15dab486a"},
+    {file = "pydantic_core-2.20.1-cp38-none-win_amd64.whl", hash = "sha256:6b507132dcfc0dea440cce23ee2182c0ce7aba7054576efc65634f080dbe9434"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b03f7941783b4c4a26051846dea594628b38f6940a2fdc0df00b221aed39314c"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1eedfeb6089ed3fad42e81a67755846ad4dcc14d73698c120a82e4ccf0f1f9f6"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:635fee4e041ab9c479e31edda27fcf966ea9614fff1317e280d99eb3e5ab6fe2"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:77bf3ac639c1ff567ae3b47f8d4cc3dc20f9966a2a6dd2311dcc055d3d04fb8a"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ed1b0132f24beeec5a78b67d9388656d03e6a7c837394f99257e2d55b461611"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6514f963b023aeee506678a1cf821fe31159b925c4b76fe2afa94cc70b3222b"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10d4204d8ca33146e761c79f83cc861df20e7ae9f6487ca290a97702daf56006"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2d036c7187b9422ae5b262badb87a20a49eb6c5238b2004e96d4da1231badef1"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9ebfef07dbe1d93efb94b4700f2d278494e9162565a54f124c404a5656d7ff09"},
+    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:6b9d9bb600328a1ce523ab4f454859e9d439150abb0906c5a1983c146580ebab"},
+    {file = "pydantic_core-2.20.1-cp39-none-win32.whl", hash = "sha256:784c1214cb6dd1e3b15dd8b91b9a53852aed16671cc3fbe4786f4f1db07089e2"},
+    {file = "pydantic_core-2.20.1-cp39-none-win_amd64.whl", hash = "sha256:d2fe69c5434391727efa54b47a1e7986bb0186e72a41b203df8f5b0a19a4f669"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a45f84b09ac9c3d35dfcf6a27fd0634d30d183205230a0ebe8373a0e8cfa0906"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d02a72df14dfdbaf228424573a07af10637bd490f0901cee872c4f434a735b94"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2b27e6af28f07e2f195552b37d7d66b150adbaa39a6d327766ffd695799780f"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:084659fac3c83fd674596612aeff6041a18402f1e1bc19ca39e417d554468482"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:242b8feb3c493ab78be289c034a1f659e8826e2233786e36f2893a950a719bb6"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:38cf1c40a921d05c5edc61a785c0ddb4bed67827069f535d794ce6bcded919fc"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e0bbdd76ce9aa5d4209d65f2b27fc6e5ef1312ae6c5333c26db3f5ade53a1e99"},
+    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:254ec27fdb5b1ee60684f91683be95e5133c994cc54e86a0b0963afa25c8f8a6"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:407653af5617f0757261ae249d3fba09504d7a71ab36ac057c938572d1bc9331"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:c693e916709c2465b02ca0ad7b387c4f8423d1db7b4649c551f27a529181c5ad"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b5ff4911aea936a47d9376fd3ab17e970cc543d1b68921886e7f64bd28308d1"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:177f55a886d74f1808763976ac4efd29b7ed15c69f4d838bbd74d9d09cf6fa86"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:964faa8a861d2664f0c7ab0c181af0bea66098b1919439815ca8803ef136fc4e"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:4dd484681c15e6b9a977c785a345d3e378d72678fd5f1f3c0509608da24f2ac0"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f6d6cff3538391e8486a431569b77921adfcdef14eb18fbf19b7c0a5294d4e6a"},
+    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:a6d511cc297ff0883bc3708b465ff82d7560193169a8b93260f74ecb0a5e08a7"},
+    {file = "pydantic_core-2.20.1.tar.gz", hash = "sha256:26ca695eeee5f9f1aeeb211ffc12f10bcb6f71e2989988fda61dabd65db878d4"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -633,127 +689,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -826,31 +761,16 @@ files = [
 
 [[package]]
 name = "typing-extensions"
-version = "4.11.0"
+version = "4.12.2"
 description = "Backported and Experimental Type Hints for Python 3.8+"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
 ]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "f69ffb8c1545d7beb035533dab109722f844f39f9ffd46b7aceb386e90fa039d"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..e5785f27d8 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -19,13 +19,13 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
 scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
+pydantic = "^2.8.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
@@ -55,6 +55,7 @@ python_version = "3.10"
 enable_error_code = ["ignore-without-code"]
 show_error_codes = true
 warn_unused_ignores = true
+plugins = "pydantic.mypy"
 
 [tool.isort]
 profile = "black"
-- 
2.34.1



* [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:13   ` Juraj Linkeš
                     ` (2 more replies)
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
                   ` (6 subsequent siblings)
  9 siblings, 3 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

This change brings in Pydantic in place of Warlock. Pydantic offers
built-in model validation directly in the classes, which allows for
simpler and more resilient code. As a consequence of this change:

- most validation is now built-in
- further validation has been added to verify:
  - cross-references between node names and ports
  - test suite and test case names
- the dictionaries representing the config schema are removed
- the config schema is no longer used for validation, but is kept as an
  alternative reference format for developers
- the config schema can now be generated automatically from the
  Pydantic models
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to inheriting the native str and Enum. This change was
  necessary to enable the discriminator for object unions (see the
  sketch below)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument parsing now catches the ValidationError that
  TestSuiteConfig can raise

Bugzilla ID: 1508
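
For reference, a minimal sketch of the discriminated-union pattern this
enables, assuming Pydantic v2. The TREX variant is invented here purely
to illustrate how the "type" field selects the model; only SCAPY exists
in this patch:

    from enum import Enum
    from typing import Annotated, Literal, Union

    from pydantic import Field, TypeAdapter
    from pydantic.dataclasses import dataclass

    class TrafficGeneratorType(str, Enum):
        SCAPY = "SCAPY"
        TREX = "TREX"  # hypothetical second variant, for illustration only

    @dataclass(frozen=True, kw_only=True)
    class ScapyConfig:
        type: Literal[TrafficGeneratorType.SCAPY]

    @dataclass(frozen=True, kw_only=True)
    class TrexConfig:  # hypothetical
        type: Literal[TrafficGeneratorType.TREX]
        server_ip: str

    # The "type" key in the input dict picks which model validates it.
    AnyConfig = Annotated[Union[ScapyConfig, TrexConfig], Field(discriminator="type")]
    config = TypeAdapter(AnyConfig).validate_python({"type": "SCAPY"})
    assert isinstance(config, ScapyConfig)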

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/config/__init__.py              | 588 +++++++++---------
 dts/framework/config/types.py                 | 132 ----
 dts/framework/runner.py                       |  35 +-
 dts/framework/settings.py                     |  16 +-
 dts/framework/testbed_model/sut_node.py       |   2 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 7 files changed, 325 insertions(+), 454 deletions(-)
 delete mode 100644 dts/framework/config/types.py

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index df60a5030e..013c529829 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,19 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+dataclass model. The Pydantic model is also available as a
+:download:`JSON schema <conf_yaml_schema.json>`.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,7 +26,7 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
+The classes defined in this package make heavy use of :mod:`pydantic.dataclasses`.
 All of them use slots and are frozen:
 
     * Slots enables some optimizations, by pre-allocating space for the defined
@@ -33,29 +35,31 @@
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
-from dataclasses import dataclass, fields
-from enum import auto, unique
+from enum import Enum, auto, unique
+from functools import cached_property
 from pathlib import Path
-from typing import Union
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple, Protocol
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import (
+    ConfigDict,
+    Field,
+    StringConstraints,
+    TypeAdapter,
+    ValidationError,
+    field_validator,
+    model_validator,
+)
+from pydantic.config import JsonDict
+from pydantic.dataclasses import dataclass
 from typing_extensions import Self
 
-from framework.config.types import (
-    BuildTargetConfigDict,
-    ConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
 from framework.utils import StrEnum
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
+
 
 @unique
 class Architecture(StrEnum):
@@ -116,14 +120,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class HugepageConfiguration:
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
@@ -136,12 +140,17 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
+PciAddress = Annotated[
+    str, StringConstraints(pattern=r"^[\da-fA-F]{4}:[\da-fA-F]{2}:[\da-fA-F]{2}\.\d:?\w*$")
+]
+"""A constrained string type representing a PCI address."""
+
+
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class PortConfig:
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -150,69 +159,53 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
+    pci: PciAddress = Field(description="The local PCI address of the port.")
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: PciAddress = Field(
+        description="The PCI address of the peer port this port is connected to."
+    )
 
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
 
-    The class will be expanded when more configuration is needed.
+class TrafficGeneratorConfig(Protocol):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, which each child class is required to define in
+            order to be distinguished among the others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+LogicalCores = Annotated[
+    str,
+    StringConstraints(pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"),
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class NodeConfiguration:
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
@@ -232,69 +225,25 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class SutNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
@@ -302,10 +251,12 @@ class SutNodeConfiguration(NodeConfiguration):
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class TGNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
@@ -313,10 +264,14 @@ class TGNodeConfiguration(NodeConfiguration):
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
 
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
+"""Union type for all the node configuration types."""
 
-@dataclass(slots=True, frozen=True)
+
+@dataclass(slots=True, frozen=True, config=ConfigDict(extra="forbid"))
 class NodeInfo:
     """Supplemental node information.
 
@@ -334,7 +289,7 @@ class NodeInfo:
     kernel_version: str
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class BuildTargetConfiguration:
     """DPDK build configuration.
 
@@ -347,40 +302,21 @@ class BuildTargetConfiguration:
         compiler: The compiler executable to use.
         compiler_wrapper: This string will be put in front of the compiler when
             executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
     """
 
     arch: Architecture
     os: OS
     cpu: CPUType
     compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    compiler_wrapper: str = ""
 
-    @classmethod
-    def from_dict(cls, d: BuildTargetConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
-
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            The build target configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class BuildTargetInfo:
     """Various versions and other information about a build target.
 
@@ -393,11 +329,39 @@ class BuildTargetInfo:
     compiler_version: str
 
 
-@dataclass(slots=True, frozen=True)
+def make_parsable_schema(schema: JsonDict):
+    """Updates a model's JSON schema to make a string representation a valid alternative.
+
+    This utility function is meant to be used with models that can be represented and validated
+    as a string instead of an object mapping. Normally the generated JSON schema shows only the
+    object mapping. This function wraps that mapping in an ``anyOf`` property alongside a plain
+    string type.
+
+    This function is a valid `Callable` for the `json_schema_extra` attribute of
+    `~pydantic.config.ConfigDict`.
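+
+    For illustration (shapes assumed, not taken from a real model), a schema such as::
+
+        {"title": "T", "description": "D", "type": "object", ...}
+
+    is rewritten to::
+
+        {"title": "T", "description": "D",
+         "anyOf": [{"description": "D", "type": "object", ...}, {"type": "string"}]}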
+    """
+    inner_schema = schema.copy()
+    del inner_schema["title"]
+
+    title = schema.get("title")
+    description = schema.get("description")
+
+    schema.clear()
+
+    schema["title"] = title
+    schema["description"] = description
+    schema["anyOf"] = [inner_schema, {"type": "string"}]
+
+
+@dataclass(
+    frozen=True,
+    config=ConfigDict(extra="forbid", json_schema_extra=make_parsable_schema),
+)
 class TestSuiteConfig:
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. It can be represented and validated as a
+    string type in the form of: ``TEST_SUITE [TEST_CASE, ...]``, in the configuration file.
 
     Attributes:
         test_suite: The name of the test suite module without the starting ``TestSuite_``.
@@ -405,31 +369,63 @@ class TestSuiteConfig:
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying name of the test suite.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert test_suite_spec is not None, f"{self.test_suite_name} is not a valid test suite name"
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names."""
+        # Use a list rather than an iterator so the membership check below can run repeatedly.
+        available_test_cases = [t.name for t in self.test_suite_spec.test_cases]
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"for test suite {self.test_suite_name}"
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
+
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
+class TestRunSUTNodeConfiguration:
+    """The SUT node configuration of a test run.
+
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
 class TestRunConfiguration:
     """The configuration of a test run.
 
@@ -442,143 +438,132 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
     """
 
     build_targets: list[BuildTargetConfiguration]
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
 
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The build target and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        build_targets: list[BuildTargetConfiguration] = list(
-            map(BuildTargetConfiguration.from_dict, d["build_targets"])
-        )
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        return cls(
-            build_targets=build_targets,
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
-
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
-
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
-
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
 
-        return type(self)(**new_config)
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
-@dataclass(slots=True, frozen=True)
+
+@dataclass(frozen=True, kw_only=True)
 class Configuration:
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
+    @field_validator("nodes")
     @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
 
-        Build target and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        return self
 
-        Args:
-            d: The configuration dictionary.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
+
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        return cls(test_runs=test_runs)
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
+
+        return test_runs_with_nodes
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` being evaluated for
+        the first time in this call, which triggers its assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
+
+
+ConfigurationType = TypeAdapter(Configuration)
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -588,14 +573,15 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return ConfigurationType.validate_python(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index cf16556403..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,132 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class BuildTargetConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    build_targets: list[BuildTargetConfigDict]
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 6b6f6a05f5..14e405aced 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -32,8 +32,10 @@
 from .config import (
     BuildTargetConfiguration,
     Configuration,
+    SutNodeConfiguration,
     TestRunConfiguration,
     TestSuiteConfig,
+    TGNodeConfiguration,
     load_config,
 )
 from .exception import (
@@ -142,18 +144,17 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
                 test_run_test_suites = list(
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig("smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -169,6 +170,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -231,10 +234,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases = []
             func_test_cases, perf_test_cases = self._filter_test_cases(
-                test_suite_class, test_suite_config.test_cases
+                test_suite_class, test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -364,6 +367,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -378,24 +383,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -425,7 +432,7 @@ def _run_test_run(
             test_suites_with_cases: The test suites with test cases to run.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index f6303066d4..2e8dedef4f 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -85,6 +85,8 @@
 from pathlib import Path
 from typing import Callable
 
+from pydantic import ValidationError
+
 from .config import TestSuiteConfig
 from .exception import ConfigurationError
 from .utils import DPDKGitTarball, get_commit_id
@@ -391,11 +393,21 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2855fe0276..5957e8140c 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -181,7 +181,7 @@ def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
                 the setup steps will be taken.
         """
         super().set_up_test_run(test_run_config)
-        for vdev in test_run_config.vdevs:
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
 
     def tear_down_test_run(self) -> None:
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 6dac86a224..bb271836a9 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 4ce1148706..39a4170979 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -39,7 +39,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
 
     def send_packet(self, packet: Packet, port: Port) -> None:
         """Send `packet` and block until it is fully sent.
-- 
2.34.1


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (2 preceding siblings ...)
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:39   ` Juraj Linkeš
                     ` (2 more replies)
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
                   ` (5 subsequent siblings)
  9 siblings, 3 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are also automatically imported. This caused the runner to import
the test suites a second time when loading them. Change the runner to
load the classes already imported by TestSuiteSpec instead of importing
them again.
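
For instance, instead of importing a module and scanning it for the test
suite class, the runner can now do (a minimal sketch, assuming the
TestSuiteSpec API introduced in patch 1; "hello_world" is just an
example suite name):

    from framework.test_suite import find_by_name

    spec = find_by_name("hello_world")
    if spec is not None:
        # The class was already imported during discovery.
        test_suite_class = spec.class_type
        case_names = [case.name for case in spec.test_cases]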

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py | 167 +++++++---------------------------------
 1 file changed, 27 insertions(+), 140 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 14e405aced..00b63cc292 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,14 +18,11 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
-import re
 import sys
 from pathlib import Path
 from types import FunctionType
-from typing import Iterable, Sequence
+from typing import Iterable
 
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
@@ -38,12 +36,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -55,7 +48,7 @@
     TestSuiteResult,
     TestSuiteWithCases,
 )
-from .test_suite import TestSuite
+from .test_suite import TestCase, TestCaseVariant, TestSuite
 
 
 class DTSRunner:
@@ -217,11 +210,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -229,139 +221,34 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
-            test_cases = []
-            func_test_cases, perf_test_cases = self._filter_test_cases(
-                test_suite_class, test_suite_config.test_cases_names
-            )
-            if func:
-                test_cases.extend(func_test_cases)
-            if perf:
-                test_cases.extend(perf_test_cases)
-
-            test_suites_with_cases.append(
-                TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
-            )
-
-        return test_suites_with_cases
-
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
-    def _filter_test_cases(
-        self, test_suite_class: type[TestSuite], test_cases_to_run: Sequence[str]
-    ) -> tuple[list[FunctionType], list[FunctionType]]:
-        """Filter `test_cases_to_run` from `test_suite_class`.
-
-        There are two rounds of filtering if `test_cases_to_run` is not empty.
-        The first filters `test_cases_to_run` from all methods of `test_suite_class`.
-        Then the methods are separated into functional and performance test cases.
-        If a method matches neither the functional nor performance name prefix, it's an error.
-
-        Args:
-            test_suite_class: The class of the test suite.
-            test_cases_to_run: Test case names to filter from `test_suite_class`.
-                If empty, return all matching test cases.
-
-        Returns:
-            A list of test case methods that should be executed.
+            test_suite_spec = test_suite_config.test_suite_spec
+            test_suite_class = test_suite_spec.class_type
+
+            filtered_test_cases: list[TestCase] = [
+                test_case
+                for test_case in test_suite_spec.test_cases
+                if not test_suite_config.test_cases_names
+                or test_case.name in test_suite_config.test_cases_names
+            ]
 
-        Raises:
-            ConfigurationError: If a test case from `test_cases_to_run` is not found
-                or it doesn't match either the functional nor performance name prefix.
-        """
-        func_test_cases = []
-        perf_test_cases = []
-        name_method_tuples = inspect.getmembers(test_suite_class, inspect.isfunction)
-        if test_cases_to_run:
-            name_method_tuples = [
-                (name, method) for name, method in name_method_tuples if name in test_cases_to_run
+            selected_test_cases: list[FunctionType] = [
+                test_case.function_type  # type: ignore[misc]
+                for test_case in filtered_test_cases
+                if (func and test_case.variant == TestCaseVariant.FUNCTIONAL)
+                or (perf and test_case.variant == TestCaseVariant.PERFORMANCE)
             ]
-            if len(name_method_tuples) < len(test_cases_to_run):
-                missing_test_cases = set(test_cases_to_run) - {
-                    name for name, _ in name_method_tuples
-                }
-                raise ConfigurationError(
-                    f"Test cases {missing_test_cases} not found among methods "
-                    f"of {test_suite_class.__name__}."
-                )
 
-        for test_case_name, test_case_method in name_method_tuples:
-            if re.match(self._func_test_case_regex, test_case_name):
-                func_test_cases.append(test_case_method)
-            elif re.match(self._perf_test_case_regex, test_case_name):
-                perf_test_cases.append(test_case_method)
-            elif test_cases_to_run:
-                raise ConfigurationError(
-                    f"Method '{test_case_name}' matches neither "
-                    f"a functional nor a performance test case name."
+            test_suites_with_cases.append(
+                TestSuiteWithCases(
+                    test_suite_class=test_suite_class, test_cases=selected_test_cases
                 )
-
-        return func_test_cases, perf_test_cases
+            )
+        return test_suites_with_cases
 
     def _connect_nodes_and_run_test_run(
         self,
-- 
2.34.1


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH 5/5] dts: add JSON schema generation script
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (3 preceding siblings ...)
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-08-22 16:39 ` Luca Vizzarro
  2024-09-17 11:59   ` Juraj Linkeš
  2024-10-01 20:48   ` Nicholas Pratte
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (4 subsequent siblings)
  9 siblings, 2 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-08-22 16:39 UTC (permalink / raw)
  To: dev
  Cc: Honnappa Nagarahalli, Juraj Linkeš, Luca Vizzarro, Paul Szczepanek

Add a new script which automatically re-generates the JSON schema file
from the Pydantic configuration models.

Moreover, regenerate the JSON schema with this script for the first time.
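
At its core, the generation boils down to dumping the schema of the
Configuration TypeAdapter (a minimal sketch; the actual script may
differ in output path and formatting):

    import json
    from pathlib import Path

    from framework.config import ConfigurationType  # TypeAdapter(Configuration)

    schema = ConfigurationType.json_schema()
    out = Path("framework") / "config" / "conf_yaml_schema.json"
    out.write_text(json.dumps(schema, indent=2) + "\n")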

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/guides/tools/dts.rst                   |  10 +
 dts/framework/config/conf_yaml_schema.json | 776 ++++++++++++---------
 dts/generate-schema.py                     |  38 +
 3 files changed, 486 insertions(+), 338 deletions(-)
 create mode 100755 dts/generate-schema.py

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..317bd0ff99 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -430,6 +430,16 @@ Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
 Configuration Schema
 --------------------
 
+The configuration schema is automatically generated from Pydantic models and can be found
+at ``dts/framework/config/conf_yaml_schema.json``. Whenever the models are changed, the schema
+should be regenerated using the dedicated script at ``dts/generate-schema.py``, e.g.:
+
+.. code-block:: console
+
+   $ poetry shell
+   (dts-py3.10) $ ./generate-schema.py
+
+
 Definitions
 ~~~~~~~~~~~
 
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index f02a310bb5..1cf1bb098a 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -1,402 +1,502 @@
 {
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
+  "$defs": {
+    "Architecture": {
+      "description": "The supported architectures of :class:`~framework.testbed_model.node.Node`\\s.",
       "enum": [
+        "i686",
         "x86_64",
+        "x86_32",
         "arm64",
         "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
+      ],
+      "title": "Architecture",
+      "type": "string"
     },
-    "build_target": {
-      "type": "object",
-      "description": "Targets supported by DTS",
+    "BuildTargetConfiguration": {
+      "additionalProperties": false,
+      "description": "DPDK build configuration.\n\nThe configuration used for building DPDK.\n\nAttributes:\n    arch: The target architecture to build for.\n    os: The target os to build for.\n    cpu: The target CPU to build for.\n    compiler: The compiler executable to use.\n    compiler_wrapper: This string will be put in front of the compiler when\n        executing the build. Useful for adding wrapper commands, such as ``ccache``.",
       "properties": {
         "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
+          "$ref": "#/$defs/Architecture"
         },
         "os": {
-          "$ref": "#/definitions/OS"
+          "$ref": "#/$defs/OS"
         },
         "cpu": {
-          "$ref": "#/definitions/cpu"
+          "$ref": "#/$defs/CPUType"
         },
         "compiler": {
-          "$ref": "#/definitions/compiler"
+          "$ref": "#/$defs/Compiler"
         },
-          "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
+        "compiler_wrapper": {
+          "default": "",
+          "title": "Compiler Wrapper",
+          "type": "string"
         }
       },
-      "additionalProperties": false,
       "required": [
         "arch",
         "os",
         "cpu",
         "compiler"
-      ]
+      ],
+      "title": "BuildTargetConfiguration",
+      "type": "object"
     },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
+    "CPUType": {
+      "description": "The supported CPUs of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "native",
+        "armv8a",
+        "dpaa2",
+        "thunderx",
+        "xgene1"
+      ],
+      "title": "CPUType",
+      "type": "string"
+    },
+    "Compiler": {
+      "description": "The supported compilers of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "gcc",
+        "clang",
+        "icc",
+        "msvc"
+      ],
+      "title": "Compiler",
+      "type": "string"
+    },
+    "HugepageConfiguration": {
+      "additionalProperties": false,
+      "description": "The hugepage configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    number_of: The number of hugepages to allocate.\n    force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.",
       "properties": {
         "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
+          "title": "Number Of",
+          "type": "integer"
         },
         "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
+          "title": "Force First Numa",
+          "type": "boolean"
         }
       },
-      "additionalProperties": false,
       "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
+        "number_of",
+        "force_first_numa"
+      ],
+      "title": "HugepageConfiguration",
+      "type": "object"
     },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
+    "OS": {
+      "description": "The supported operating systems of :class:`~framework.testbed_model.node.Node`\\s.",
+      "enum": [
+        "linux",
+        "freebsd",
+        "windows"
+      ],
+      "title": "OS",
+      "type": "string"
     },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
+    "PortConfig": {
+      "additionalProperties": false,
+      "description": "The port configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    pci: The PCI address of the port.\n    os_driver_for_dpdk: The operating system driver name for use with DPDK.\n    os_driver: The operating system driver name when the operating system controls the port.\n    peer_node: The :class:`~framework.testbed_model.node.Node` of the port\n        connected to this port.\n    peer_pci: The PCI address of the port connected to this port.",
+      "properties": {
+        "pci": {
+          "description": "The local PCI address of the port.",
+          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
+          "title": "Pci",
+          "type": "string"
+        },
+        "os_driver_for_dpdk": {
+          "description": "The driver that the kernel should bind this device to for DPDK to use it.",
+          "examples": [
+            "vfio-pci",
+            "mlx5_core"
+          ],
+          "title": "Os Driver For Dpdk",
+          "type": "string"
+        },
+        "os_driver": {
+          "description": "The driver normally used by this port",
+          "examples": [
+            "i40e",
+            "ice",
+            "mlx5_core"
+          ],
+          "title": "Os Driver",
+          "type": "string"
+        },
+        "peer_node": {
+          "description": "The name of the peer node this port is connected to.",
+          "title": "Peer Node",
+          "type": "string"
+        },
+        "peer_pci": {
+          "description": "The PCI address of the peer port this port is connected to.",
+          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
+          "title": "Peer Pci",
+          "type": "string"
         }
-      ]
+      },
+      "required": [
+        "pci",
+        "os_driver_for_dpdk",
+        "os_driver",
+        "peer_node",
+        "peer_pci"
+      ],
+      "title": "PortConfig",
+      "type": "object"
     },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter"
-      ]
+    "ScapyTrafficGeneratorConfig": {
+      "additionalProperties": false,
+      "description": "Scapy traffic generator specific configuration.",
+      "properties": {
+        "type": {
+          "const": "SCAPY",
+          "enum": [
+            "SCAPY"
+          ],
+          "title": "Type",
+          "type": "string"
+        }
+      },
+      "required": [
+        "type"
+      ],
+      "title": "ScapyTrafficGeneratorConfig",
+      "type": "object"
     },
-    "test_target": {
-      "type": "object",
+    "SutNodeConfiguration": {
+      "additionalProperties": false,
+      "description": ":class:`~framework.testbed_model.sut_node.SutNode` specific configuration.\n\nAttributes:\n    memory_channels: The number of memory channels to use when running DPDK.",
       "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
+        "name": {
+          "description": "A unique identifier for this node.",
+          "title": "Name",
+          "type": "string"
+        },
+        "hostname": {
+          "description": "The hostname or IP address of the node.",
+          "title": "Hostname",
+          "type": "string"
+        },
+        "user": {
+          "description": "The login user to use to connect to this node.",
+          "title": "User",
+          "type": "string"
         },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
+        "password": {
+          "anyOf": [
+            {
+              "type": "string"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null,
+          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
+          "title": "Password"
+        },
+        "use_first_core": {
+          "default": false,
+          "description": "DPDK won't use the first physical core if set to False.",
+          "title": "Use First Core",
+          "type": "boolean"
+        },
+        "hugepages_2mb": {
+          "anyOf": [
+            {
+              "$ref": "#/$defs/HugepageConfiguration"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null
+        },
+        "ports": {
           "items": {
-            "type": "string"
+            "$ref": "#/$defs/PortConfig"
           },
-          "minimum": 1
+          "minItems": 1,
+          "title": "Ports",
+          "type": "array"
+        },
+        "memory_channels": {
+          "default": 1,
+          "description": "Number of memory channels to use when running DPDK.",
+          "title": "Memory Channels",
+          "type": "integer"
+        },
+        "arch": {
+          "$ref": "#/$defs/Architecture"
+        },
+        "os": {
+          "$ref": "#/$defs/OS"
+        },
+        "lcores": {
+          "default": "1",
+          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
+          "examples": [
+            "1,2,3,4,5,18-22",
+            "10-15"
+          ],
+          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+          "title": "Lcores",
+          "type": "string"
         }
       },
       "required": [
-        "suite"
+        "name",
+        "hostname",
+        "user",
+        "ports",
+        "arch",
+        "os"
       ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
+      "title": "SutNodeConfiguration",
+      "type": "object"
+    },
+    "TGNodeConfiguration": {
+      "additionalProperties": false,
+      "description": ":class:`~framework.testbed_model.tg_node.TGNode` specific configuration.\n\nAttributes:\n    traffic_generator: The configuration of the traffic generator present on the TG node.",
+      "properties": {
+        "name": {
+          "description": "A unique identifier for this node.",
+          "title": "Name",
+          "type": "string"
+        },
+        "hostname": {
+          "description": "The hostname or IP address of the node.",
+          "title": "Hostname",
+          "type": "string"
+        },
+        "user": {
+          "description": "The login user to use to connect to this node.",
+          "title": "User",
+          "type": "string"
+        },
+        "password": {
+          "anyOf": [
+            {
+              "type": "string"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null,
+          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
+          "title": "Password"
+        },
+        "use_first_core": {
+          "default": false,
+          "description": "DPDK won't use the first physical core if set to False.",
+          "title": "Use First Core",
+          "type": "boolean"
+        },
+        "hugepages_2mb": {
+          "anyOf": [
+            {
+              "$ref": "#/$defs/HugepageConfiguration"
+            },
+            {
+              "type": "null"
+            }
+          ],
+          "default": null
+        },
+        "ports": {
+          "items": {
+            "$ref": "#/$defs/PortConfig"
           },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
+          "minItems": 1,
+          "title": "Ports",
+          "type": "array"
+        },
+        "arch": {
+          "$ref": "#/$defs/Architecture"
+        },
+        "os": {
+          "$ref": "#/$defs/OS"
+        },
+        "lcores": {
+          "default": "1",
+          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
+          "examples": [
+            "1,2,3,4,5,18-22",
+            "10-15"
+          ],
+          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+          "title": "Lcores",
+          "type": "string"
+        },
+        "traffic_generator": {
+          "discriminator": {
+            "mapping": {
+              "SCAPY": "#/$defs/ScapyTrafficGeneratorConfig"
+            },
+            "propertyName": "type"
           },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
+          "oneOf": [
+            {
+              "$ref": "#/$defs/ScapyTrafficGeneratorConfig"
+            }
+          ],
+          "title": "Traffic Generator"
+        }
+      },
+      "required": [
+        "name",
+        "hostname",
+        "user",
+        "ports",
+        "arch",
+        "os",
+        "traffic_generator"
+      ],
+      "title": "TGNodeConfiguration",
+      "type": "object"
+    },
+    "TestRunConfiguration": {
+      "additionalProperties": false,
+      "description": "The configuration of a test run.\n\nThe configuration contains testbed information, what tests to execute\nand with what DPDK build.\n\nAttributes:\n    build_targets: A list of DPDK builds to test.\n    perf: Whether to run performance tests.\n    func: Whether to run functional tests.\n    skip_smoke_tests: Whether to skip smoke tests.\n    test_suites: The names of test suites and/or test cases to execute.\n    system_under_test_node: The SUT node configuration to use in this test run.\n    traffic_generator_node: The TG node name to use in this test run.",
+      "properties": {
+        "perf": {
+          "description": "Enable performance testing.",
+          "title": "Perf",
+          "type": "boolean"
+        },
+        "func": {
+          "description": "Enable functional testing.",
+          "title": "Func",
+          "type": "boolean"
+        },
+        "test_suites": {
+          "items": {
+            "$ref": "#/$defs/TestSuiteConfig"
           },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
+          "minItems": 1,
+          "title": "Test Suites",
+          "type": "array"
+        },
+        "build_targets": {
+          "items": {
+            "$ref": "#/$defs/BuildTargetConfiguration"
           },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
+          "title": "Build Targets",
+          "type": "array"
+        },
+        "skip_smoke_tests": {
+          "default": false,
+          "title": "Skip Smoke Tests",
+          "type": "boolean"
+        },
+        "system_under_test_node": {
+          "$ref": "#/$defs/TestRunSUTNodeConfiguration"
+        },
+        "traffic_generator_node": {
+          "title": "Traffic Generator Node",
+          "type": "string"
+        }
+      },
+      "required": [
+        "perf",
+        "func",
+        "test_suites",
+        "build_targets",
+        "system_under_test_node",
+        "traffic_generator_node"
+      ],
+      "title": "TestRunConfiguration",
+      "type": "object"
+    },
+    "TestRunSUTNodeConfiguration": {
+      "additionalProperties": false,
+      "description": "The SUT node configuration of a test run.\n\nAttributes:\n    node_name: The SUT node to use in this test run.\n    vdevs: The names of virtual devices to test.",
+      "properties": {
+        "vdevs": {
+          "items": {
+            "type": "string"
           },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
+          "title": "Vdevs",
+          "type": "array"
         },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
+        "node_name": {
+          "title": "Node Name",
+          "type": "string"
+        }
       },
-      "minimum": 1
+      "required": [
+        "node_name"
+      ],
+      "title": "TestRunSUTNodeConfiguration",
+      "type": "object"
     },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "build_targets": {
-            "type": "array",
-            "items": {
-              "$ref": "#/definitions/build_target"
+    "TestSuiteConfig": {
+      "anyOf": [
+        {
+          "additionalProperties": false,
+          "properties": {
+            "test_suite": {
+              "description": "The identifying name of the test suite.",
+              "title": "Test suite name",
+              "type": "string"
             },
-            "minimum": 1
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
+            "test_cases": {
+              "description": "The identifying name of the test cases of the test suite.",
+              "items": {
+                "type": "string"
+              },
+              "title": "Test cases by name",
+              "type": "array"
             }
           },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
+          "required": [
+            "test_suite"
+          ],
+          "type": "object"
+        },
+        {
+          "type": "string"
+        }
+      ],
+      "description": "Test suite configuration.\n\nInformation about a single test suite to be executed. It can be represented and validated as a\nstring type in the form of: ``TEST_SUITE [TEST_CASE, ...]``, in the configuration file.\n\nAttributes:\n    test_suite: The name of the test suite module without the starting ``TestSuite_``.\n    test_cases: The names of test cases from this test suite to execute.\n        If empty, all test cases will be executed.",
+      "title": "TestSuiteConfig"
+    }
+  },
+  "description": "DTS testbed and test configuration.\n\nAttributes:\n    test_runs: Test run configurations.\n    nodes: Node configurations.",
+  "properties": {
+    "test_runs": {
+      "items": {
+        "$ref": "#/$defs/TestRunConfiguration"
+      },
+      "minItems": 1,
+      "title": "Test Runs",
+      "type": "array"
+    },
+    "nodes": {
+      "items": {
+        "anyOf": [
+          {
+            "$ref": "#/$defs/TGNodeConfiguration"
           },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
+          {
+            "$ref": "#/$defs/SutNodeConfiguration"
           }
-        },
-        "additionalProperties": false,
-        "required": [
-          "build_targets",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
         ]
       },
-      "minimum": 1
+      "minItems": 1,
+      "title": "Nodes",
+      "type": "array"
     }
   },
   "required": [
     "test_runs",
     "nodes"
   ],
-  "additionalProperties": false
-}
+  "title": "Configuration",
+  "type": "object",
+  "$schema": "https://json-schema.org/draft/2020-12/schema"
+}
\ No newline at end of file
diff --git a/dts/generate-schema.py b/dts/generate-schema.py
new file mode 100755
index 0000000000..b41d28492f
--- /dev/null
+++ b/dts/generate-schema.py
@@ -0,0 +1,38 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Arm Limited
+
+"""JSON schema generation script."""
+
+import json
+import os
+
+from pydantic.json_schema import GenerateJsonSchema
+
+from framework.config import ConfigurationType
+
+DTS_DIR = os.path.dirname(os.path.realpath(__file__))
+RELATIVE_PATH_TO_SCHEMA = "framework/config/conf_yaml_schema.json"
+
+
+class GenerateSchemaWithDialect(GenerateJsonSchema):
+    """Custom schema generator which adds the schema dialect."""
+
+    def generate(self, schema, mode="validation"):
+        """Generate JSON schema."""
+        json_schema = super().generate(schema, mode=mode)
+        json_schema["$schema"] = self.schema_dialect
+        return json_schema
+
+
+try:
+    path = os.path.join(DTS_DIR, RELATIVE_PATH_TO_SCHEMA)
+
+    with open(path, "w") as schema_file:
+        schema_dict = ConfigurationType.json_schema(schema_generator=GenerateSchemaWithDialect)
+        schema_json = json.dumps(schema_dict, indent=2)
+        schema_file.write(schema_json)
+
+    print("Schema generated successfully!")
+except Exception as e:
+    raise Exception("failed to generate schema") from e
-- 
2.34.1


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-09-16 13:00   ` Juraj Linkeš
  2024-10-29 12:57     ` Luca Vizzarro
  2024-09-19 20:01   ` Nicholas Pratte
  1 sibling, 1 reply; 83+ messages in thread
From: Juraj Linkeš @ 2024-09-16 13:00 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek

There are some elements which seem to be present in 
https://patches.dpdk.org/project/dpdk/patch/20240821145315.97974-4-juraj.linkes@pantheon.tech/, 
which is an attempt at decorating test cases (Bugzilla 1460) as part of 
the capabilities series.

Looks like we could create a separate patch containing Bugzilla 1460 and 
this patch, on which both the capabilities series and this series would 
depend. What do you think? It certainly makes sense to have decorating 
test cases separate from capabilities.

On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> Currently there is a lack of a definition which identifies all the test
> suites available to test. This change intends to simplify the process to
> discover all the test suites and idenfity them.
> 
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>   dts/framework/test_suite.py | 182 +++++++++++++++++++++++++++++++++++-
>   1 file changed, 181 insertions(+), 1 deletion(-)
> 
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 694b2eba65..972968b036 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -1,6 +1,7 @@
>   # SPDX-License-Identifier: BSD-3-Clause
>   # Copyright(c) 2010-2014 Intel Corporation
>   # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>   
>   """Features common to all test suites.
>   
> @@ -13,12 +14,22 @@
>       * Test case verification.
>   """
>   
> +import inspect
> +import re
> +from dataclasses import dataclass
> +from enum import Enum, auto
> +from functools import cached_property
> +from importlib import import_module
>   from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> -from typing import ClassVar, Union
> +from pkgutil import iter_modules
> +from types import FunctionType, ModuleType
> +from typing import ClassVar, NamedTuple, Union
>   
> +from pydantic.alias_generators import to_pascal

This is using pydantic, but it's only added in the subsequent patch.

>   from scapy.layers.inet import IP  # type: ignore[import-untyped]
>   from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
>   from scapy.packet import Packet, Padding  # type: ignore[import-untyped]
> +from typing_extensions import Self
>   
>   from framework.testbed_model.port import Port, PortLink
>   from framework.testbed_model.sut_node import SutNode
> @@ -365,3 +376,172 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
>           if received_packet.src != expected_packet.src or received_packet.dst != expected_packet.dst:
>               return False
>           return True
> +
> +
> +class TestCaseVariant(Enum):
> +    """Enum representing the variant of the test case."""
> +
> +    #:
> +    FUNCTIONAL = auto()
> +    #:
> +    PERFORMANCE = auto()
> +
> +
> +class TestCase(NamedTuple):
> +    """Tuple representing a test case."""
> +
> +    #: The name of the test case without prefix
> +    name: str
> +    #: The reference to the function
> +    function_type: FunctionType

I had to read almost the whole patch to understand what this is. It's 
not the type of a function, it's the function object itself, which is 
what the docstring says, but I glossed over that. This should be named 
just function or maybe function_obj.
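
Something like this, i.e. just renaming the field (an untested sketch, 
keeping the rest of the tuple as-is):

    class TestCase(NamedTuple):
        """Tuple representing a test case."""

        #: The name of the test case without prefix
        name: str
        #: The test case function object
        function: FunctionType
        #: The test case variant
        variant: TestCaseVariant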

> +    #: The test case variant
> +    variant: TestCaseVariant
> +
> +
> +@dataclass
> +class TestSuiteSpec:
> +    """A class defining the specification of a test suite.
> +
> +    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
> +    provided to automatically discover all the available test suites.
> +

We should probably document the assumption that there's only one 
TestSuite subclass in a test suite module.

> +    Attributes:
> +        module_name: The name of the test suite's module.
> +    """
> +
> +    #:
> +    TEST_SUITES_PACKAGE_NAME = "tests"

Formally speaking, the tests dir doesn't have an __init__.py file in it, 
so it isn't a package, but the name is fine.

> +    #:
> +    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
> +    #:
> +    TEST_SUITE_CLASS_PREFIX = "Test"
> +    #:
> +    TEST_CASE_METHOD_PREFIX = "test_"
> +    #:
> +    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
> +    #:
> +    PERF_TEST_CASE_REGEX = r"test_perf_"
> +

These are common to all test suites, so they should be class variables.

I'm also wondering whether these should be documented in the 
module-level docstring. It makes sense to document there what a subclass 
is supposed to look like (and where it's supposed to be located by 
default). If we do this, we may need to move parts of the class's 
docstring as well.
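
For example (a sketch, assuming typing.ClassVar is in scope in the 
module):

    #:
    TEST_SUITES_PACKAGE_NAME: ClassVar[str] = "tests"
    #:
    TEST_SUITE_MODULE_PREFIX: ClassVar[str] = "TestSuite_"

and so on for the rest; annotating them with ClassVar keeps them out of 
the dataclass fields.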

> +    module_name: str
> +
> +    @cached_property

Nice touch, we are using our own implementation of this elsewhere, so 
maybe we should create a ticket to update those to use @cached_property 
instead.

> +    def name(self) -> str:

TestSuiteSpec.name really sounds like the name of a TestSuite, so I'd 
rename this to module_name.

> +        """The name of the test suite's module."""
> +        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
> +
> +    @cached_property
> +    def module_type(self) -> ModuleType:

This isn't a module type, just an instance of the module object, right? 
Could be named just module.

> +        """A reference to the test suite's module."""
> +        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
> +
> +    @cached_property
> +    def class_name(self) -> str:
> +        """The name of the test suite's class."""
> +        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
> +
> +    @cached_property
> +    def class_type(self) -> type[TestSuite]:

Class type would be the type of the class, but this is just the class, 
right? Could be named just class.

> +        """A reference to the test suite's class."""
> +
> +        def is_test_suite(obj) -> bool:
> +            """Check whether `obj` is a :class:`TestSuite`.
> +
> +            The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
> +
> +            Args:
> +                obj: The object to be checked.
> +
> +            Returns:
> +                :data:`True` if `obj` is a subclass of `TestSuite`.
> +            """
> +            try:
> +                if issubclass(obj, TestSuite) and obj is not TestSuite:
> +                    return True
> +            except TypeError:
> +                return False
> +            return False
> +
> +        for class_name, class_type in inspect.getmembers(self.module_type, is_test_suite):
> +            if class_name == self.class_name:
> +                return class_type
> +
> +        raise Exception("class not found in eligible test module")

This should be a DTS error, maybe InternalError? This doesn't seem like 
ConfigurationError. It should also say which module and be a proper 
sentence (capital first letter, end with a dot).
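
Something along the lines of (a sketch, assuming we pick InternalError 
from framework.exception):

    raise InternalError(
        f"Expected class {self.class_name} not found in module {self.module_name}."
    )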

> +
> +    @cached_property
> +    def test_cases(self) -> list[TestCase]:
> +        """A list of all the available test cases."""
> +        test_cases = []
> +
> +        functions = inspect.getmembers(self.class_type, inspect.isfunction)
> +        for fn_name, fn_type in functions:

fn_obj instead of fn_type. The type suffix used in the whole module is 
very confusing.

> +            if prefix := re.match(self.FUNC_TEST_CASE_REGEX, fn_name):
> +                variant = TestCaseVariant.FUNCTIONAL
> +            elif prefix := re.match(self.PERF_TEST_CASE_REGEX, fn_name):
> +                variant = TestCaseVariant.PERFORMANCE
> +            else:
> +                continue
> +
> +            name = fn_name[len(prefix.group(0)) :]

Do we actually want to strip the prefix? It could be confusing if it 
appears in logs.

> +            test_cases.append(TestCase(name, fn_type, variant))
> +
> +        return test_cases
> +
> +    @classmethod
> +    def discover_all(
> +        cls, package_name: str | None = None, module_prefix: str | None = None
> +    ) -> list[Self]:
> +        """Discover all the test suites.
> +
> +        The test suites are discovered in the provided `package_name`. The full module name,
> +        expected under that package, is prefixed with `module_prefix`.
> +        The module name is a standard filename with words separated with underscores.
> +        For each module found, search for a :class:`TestSuite` class which starts
> +        with `self.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in PascalCase.

`self.TEST_SUITE_CLASS_PREFIX` -> 
:attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`

> +
> +        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
> +
> +            OS -> Os
> +            TCP -> Tcp
> +
> +        Args:
> +            package_name: The name of the package where to find the test suites, if none is set the

I'd separate this into two sentences, with the second one reworded a bit:

If :data:`None`, the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` 
constant is used.

> +                constant :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used instead.
> +            module_prefix: The name prefix defining the test suite module, if none is set the

Same here.

> +                constant :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` is used instead.
> +
> +        Returns:
> +            A list containing all the discovered test suites.
> +        """
> +        if package_name is None:
> +            package_name = cls.TEST_SUITES_PACKAGE_NAME
> +        if module_prefix is None:
> +            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
> +
> +        test_suites = []
> +
> +        test_suites_pkg = import_module(package_name)
> +        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
> +            if not module_name.startswith(module_prefix) or is_pkg:
> +                continue
> +
> +            test_suite = cls(module_name)
> +            try:
> +                if test_suite.class_type:
> +                    test_suites.append(test_suite)
> +            except Exception:
> +                pass

It may be beneficial to log a warning that we found a {module_prefix} 
test suite module without any actual valid test suites.

> +
> +        return test_suites
> +
> +
> +AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
> +"""Constant to store all the available, discovered and imported test suites.
> +
> +The test suites should be gathered from this list to avoid importing more than once.
> +"""

We could store this in TestSuiteSpec itself. This would allow us to move 
the find_by_name function into it and also to avoid importing everything 
at once, importing only what's needed when it hasn't been imported 
before. But maybe we don't want to do that, since we'd lose the 
verification aspect.

I'm just not a fan of code being executed when we import a module: we 
didn't call anything, it just sorta happened. Looks like this is used 
when parsing configuration, so we could do the full scan lazily with 
@cached_property and that way it'll be the best of both worlds.
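
Since a module-level constant can't be a @cached_property, functools.cache 
on a function is probably the closest equivalent; the idea is the same 
(sketch only):

    from functools import cache

    @cache
    def available_test_suites() -> list[TestSuiteSpec]:
        """Discover and import the test suites on first use only."""
        return TestSuiteSpec.discover_all()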

> +
> +
> +def find_by_name(name: str) -> TestSuiteSpec | None:

It should be clearer from the name/args/docstring that we're trying to 
find the test suite by module name.

> +    """Find a requested test suite by name from the available ones."""
> +    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)

A list comprehension would be easier to understand, I think (mostly 
because it would remove the question of why do it this way instead of a 
list comprehension):
test_suite_specs = [test_suite_spec for test_suite_spec in 
AVAILABLE_TEST_SUITES if test_suite_spec.name == name]

> +    return next(test_suites, None)

And then test_suite_specs[0] if test_suite_specs else None
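
Putting the two together, the whole helper could look like (sketch):

    def find_by_name(name: str) -> TestSuiteSpec | None:
        """Find a test suite spec by its module name, if any."""
        test_suite_specs = [
            test_suite_spec
            for test_suite_spec in AVAILABLE_TEST_SUITES
            if test_suite_spec.name == name
        ]
        return test_suite_specs[0] if test_suite_specs else None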

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
@ 2024-09-16 13:17   ` Juraj Linkeš
  2024-09-19 19:56   ` Nicholas Pratte
  2024-09-30 20:41   ` Dean Marx
  2 siblings, 0 replies; 83+ messages in thread
From: Juraj Linkeš @ 2024-09-16 13:17 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek



On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> Add Pydantic to the project dependencies while dropping Warlock.
> 

We should explain what pydantic is and why it's replacing warlock (and I 
think make them lowercase as that's how they appear in pyproject.toml).

But maybe we shouldn't remove warlock in this patch since that would 
make the dependencies incorrect (the code still uses it). The code would 
still work with previously installed dependencies so it's probably not a 
big deal though.

> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
@ 2024-09-17 11:13   ` Juraj Linkeš
  2024-10-29 13:00     ` Luca Vizzarro
  2024-09-30 17:56   ` Nicholas Pratte
  2024-09-30 21:45   ` Dean Marx
  2 siblings, 1 reply; 83+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:13 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek



On 22. 8. 2024 18:39, Luca Vizzarro wrote:
> This change brings in Pydantic in place of Warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> a more resilient and simpler code. As a consequence of this change:
> 
> - most validation is now built-in
> - further validation is added to verify:
>    - cross referencing of node names and ports
>    - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used for validation but kept as an
>    alternative format for the developer

If it's not used, we should remove it right away (in this patch). I see 
that it's updated in v5, but we can just add it back.

> - the config schema can now be generated automatically from the
>    Pydantic models
> - the TrafficGeneratorType enum has been changed from inheriting
>    StrEnum to the native str and Enum. This change was necessary to
>    enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>    match the structure of the configuration files
> - updates the test suite argument to catch the ValidationError that
>    TestSuiteConfig can now raise

Passive voice is not used here, but the rest of the bullet points are 
using it.

>   delete mode 100644 dts/framework/config/types.py

A note, don't forget to remove this from doc sources if those get merged 
before this.

> 
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py

> @@ -2,17 +2,19 @@

> -the YAML test run configuration file
> -and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> +dataclass model. The Pydantic model is also available as
> +:download:`JSON schema <conf_yaml_schema.json>`.

This second sentence should be moved to the last patch.


> @@ -33,29 +35,31 @@

> +)
> +from pydantic.config import JsonDict
> +from pydantic.dataclasses import dataclass

We should probably distinguish between built-in dataclasses and pydantic 
dataclasses (as pydantic adds the extra argument). Importing them as 
pydantic_dataclass seems like the easiest way to achieve this.
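
I.e. something like (sketch):

    from pydantic.dataclasses import dataclass as pydantic_dataclass

    @pydantic_dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
    class PortConfig:
        ...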


> @@ -116,14 +120,14 @@ class Compiler(StrEnum):

>   @unique
> -class TrafficGeneratorType(StrEnum):
> +class TrafficGeneratorType(str, Enum):
>       """The supported traffic generators."""
>   
>       #:
> -    SCAPY = auto()
> +    SCAPY = "SCAPY"

Do discriminators not work with auto()?


> -@dataclass(slots=True, frozen=True)
> +@dataclass(slots=True, frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))

Is there any special reason for kw_only? Maybe we should add the reason 
for this (and also the config arg) to the module docstring and commit msg.


> @@ -136,12 +140,17 @@ class HugepageConfiguration:

> -@dataclass(slots=True, frozen=True)
> +PciAddress = Annotated[
> +    str, StringConstraints(pattern=r"^[\da-fA-F]{4}:[\da-fA-F]{2}:[\da-fA-F]{2}.\d:?\w*$")

We have a pattern for this in utils.py. We can reuse and maybe update it 
if needed.

> +]
> +"""A constrained string type representing a PCI address."""

This should be above the var, and I think a regular comment (with #) 
should suffice.


> @@ -150,69 +159,53 @@ class PortConfig:

> +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
>   
> -@dataclass(slots=True, frozen=True)
> -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> -    """Scapy traffic generator specific configuration."""
>   
> -    pass
> +LogicalCores = Annotated[
> +    str,
> +    StringConstraints(pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$"),
> +    Field(
> +        description="Comma-separated list of logical cores to use. "
> +        "An empty string means use all lcores.",
> +        examples=["1,2,3,4,5,18-22", "10-15"],
> +    ),
> +]

These two types don't have a docstring, but the others do.


> @@ -232,69 +225,25 @@ class NodeConfiguration:

>       arch: Architecture
>       os: OS

Adding the descriptions to all fields would be beneficial. Do we want to 
do that in this patch?


> @@ -313,10 +264,14 @@ class TGNodeConfiguration(NodeConfiguration):

> -    traffic_generator: TrafficGeneratorConfig
> +    traffic_generator: TrafficGeneratorConfigTypes
> +
>   
> +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
> +"""Union type for all the node configuration types."""

Same note as with PciAddress.


> @@ -405,31 +369,63 @@ class TestSuiteConfig:

> +    test_suite_name: str = Field(
> +        title="Test suite name",
> +        description="The identifying name of the test suite.",

I think we need to update this to mention that it's the test suite 
module name. Maybe we can also update the field, as it's only used in 
this object.

> +        alias="test_suite",
> +    )
> +    test_cases_names: list[str] = Field(
> +        default_factory=list,
> +        title="Test cases by name",
> +        description="The identifying name of the test cases of the test suite.",
> +        alias="test_cases",
> +    )

The attributes under Attributes need to be updated.

> +
> +    @cached_property
> +    def test_suite_spec(self) -> "TestSuiteSpec":
> +        """The specification of the requested test suite."""
> +        from framework.test_suite import find_by_name
> +
> +        test_suite_spec = find_by_name(self.test_suite_name)
> +        assert test_suite_spec is not None, f"{self.test_suite_name} is not a valid test suite name"

Doesn't end with a dot; the message should also mention that we're 
dealing with the module name.

> +        return test_suite_spec
> +
> +    @model_validator(mode="before")

I think it makes sense to exclude these from docs. I tried putting :meta 
private: into a docstring and it seems to be working.

>       @classmethod
> -    def from_dict(
> -        cls,
> -        entry: str | TestSuiteConfigDict,
> -    ) -> Self:
> -        """Create an instance from two different types.
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation into a valid mapping."""
> +        if isinstance(data, str):
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
> +

Why is this here? To unify the format with the one accepted by the 
--test-suite argument? Do we want to add an alternative format? If so, 
we need to make sure we document clearly that there are two alternatives 
and that they're equivalent.

> +    @model_validator(mode="after")
> +    def validate_names(self) -> Self:
> +        """Validate the supplied test suite and test cases names."""

In Configuration.validate_test_runs_with_nodes() the docstring mentions 
the use of the cached property, let's also do that here.

> +        available_test_cases = map(lambda t: t.name, self.test_suite_spec.test_cases)
> +        for requested_test_case in self.test_cases_names:
> +            assert requested_test_case in available_test_cases, (
> +                f"{requested_test_case} is not a valid test case "
> +                f"for test suite {self.test_suite_name}"

for test suite -> of test suite; also end with a dot. The dot is missing 
in a lot of places (and capital letters where the message doesn't start 
with a var value).


> @@ -442,143 +438,132 @@ class TestRunConfiguration:

> -@dataclass(slots=True, frozen=True)
> +
> +@dataclass(frozen=True, kw_only=True)
>   class Configuration:
>       """DTS testbed and test configuration.
>   
> -    The node configuration is not stored in this object. Rather, all used node configurations
> -    are stored inside the test run configuration where the nodes are actually used.
> -

I think it makes sense to explain the extra validation (with the 
@*_validator decorators) that's being done in the docstring (if we 
remove the validation methods from the generated docs). The docstring 
should be updated for each model that does the extra validation.

>       Attributes:
>           test_runs: Test run configurations.
> +        nodes: Node configurations.
>       """
>   
> -    test_runs: list[TestRunConfiguration]
> +    test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
>   
> +    @field_validator("nodes")
>       @classmethod
> -    def from_dict(cls, d: ConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> +    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
> +        """Validate that the node names are unique."""
> +        nodes_by_name: dict[str, int] = {}
> +        for node_no, node in enumerate(nodes):
> +            assert node.name not in nodes_by_name, (
> +                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
> +                f"({node.name})"
> +            )
> +            nodes_by_name[node.name] = node_no
> +
> +        return nodes
> +
> +    @model_validator(mode="after")
> +    def validate_ports(self) -> Self:
> +        """Validate that the ports are all linked to valid ones."""
> +        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
> +            (node.name, port.pci): False for node in self.nodes for port in node.ports
> +        }
> +
> +        for node_no, node in enumerate(self.nodes):

I can see why we're using enumeration for nodes in validate_node_names, 
but here we can just use node names in assert messages. At least that 
should be the case if nodes get validated before this model validator 
runs - is that the case?

> +            for port_no, port in enumerate(node.ports):
> +                peer_port_identifier = (port.peer_node, port.peer_pci)
> +                peer_port = port_links.get(peer_port_identifier, None)
> +                assert peer_port is not None, (
> +                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
> +                )
> +                assert peer_port is False, (
> +                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
> +                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
> +                )
> +                port_links[peer_port_identifier] = (node_no, port_no)
>   

> +    @cached_property
> +    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:

Let's move the property to be the first member of the class, to unify 
the order with TestSuiteConfig.

> +        """List test runs with the associated nodes."""

This doesn't list the test runs. I think the docstring should say a bit 
more to make it obvious that this is the main attribute to use with this 
class. Or maybe that could be in the class's docstring.

We're also missing the Returns: section.

> +        test_runs_with_nodes = []
>   
> -        Returns:
> -            The whole configuration instance.
> -        """
> -        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
> -            map(NodeConfiguration.from_dict, d["nodes"])
> -        )
> -        assert len(nodes) > 0, "There must be a node to test"
> +        for test_run_no, test_run in enumerate(self.test_runs):
> +            sut_node_name = test_run.system_under_test_node.node_name
> +            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)

There are a number of these instead of a list comprehension (I mentioned 
this in a previous patch). I still don't really see a reason to not use 
list comprehensions in all these cases.
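
E.g. (sketch):

    sut_nodes = [node for node in self.nodes if node.name == sut_node_name]
    sut_node = sut_nodes[0] if sut_nodes else None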


> +
> +
> +ConfigurationType = TypeAdapter(Configuration)

This new transformed class exists only for validation purposes, right? I 
think we can move this to load_config, as it's not going to be used 
anywhere else.

Also I'd rename it to something else, it's not a type. Maybe 
ConfigurationAdapter or PydanticConfiguration or ConfigurationModel (as 
the adapter adds some methods from BaseModel). Or something else, but 
the type in the name confused me.
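
To illustrate (a sketch only; I'm assuming load_config keeps a signature 
along the lines of the current one and loads YAML the same way):

    from pathlib import Path

    import yaml
    from pydantic import TypeAdapter

    def load_config(config_file_path: Path) -> Configuration:
        """Load and validate the YAML test run configuration file."""
        with open(config_file_path) as f:
            config_data = yaml.safe_load(f)
        # The adapter is only needed here, at validation time.
        return TypeAdapter(Configuration).validate_python(config_data)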

> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> @@ -231,10 +234,10 @@ def _get_test_suites_with_cases(
>           test_suites_with_cases = []
>   
>           for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
> +            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)

We've already done all the validation and importing at this point and we 
should be able to use test_suite_config.test_suite_spec, right? The same 
is true for TestSuiteWithCases, which holds the same information.

Looks like you removed _get_test_suite_class in a subsequent patch, but 
we should think about getting rid of TestSuiteWithCases, as it was 
conceived to do what TestSuiteSpec is doing.


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-09-17 11:39   ` Juraj Linkeš
  2024-10-29 12:52     ` Luca Vizzarro
  2024-10-01 17:12   ` Dean Marx
  2024-10-01 20:45   ` Nicholas Pratte
  2 siblings, 1 reply; 83+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:39 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek


> diff --git a/dts/framework/runner.py b/dts/framework/runner.py

> @@ -229,139 +221,34 @@ def _get_test_suites_with_cases(

> +            filtered_test_cases: list[TestCase] = [
> +                test_case
> +                for test_case in test_suite_spec.test_cases
> +                if not test_suite_config.test_cases_names
> +                or test_case.name in test_suite_config.test_cases_names
> +            ]

Ah, looks like TestSuiteSpec doesn't contain the subset we want to test. 
Could we adapt it this way? I think we don't really care about test 
cases we don't want to test.


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 5/5] dts: add JSON schema generation script
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
@ 2024-09-17 11:59   ` Juraj Linkeš
  2024-10-01 20:48   ` Nicholas Pratte
  1 sibling, 0 replies; 83+ messages in thread
From: Juraj Linkeš @ 2024-09-17 11:59 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek


>   create mode 100755 dts/generate-schema.py

Could it be worth putting this into devtools? It is a devtool.

> 
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst

> @@ -430,6 +430,16 @@ Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
>   Configuration Schema
>   --------------------
>   
> +The configuration schema is automatically generated from Pydantic models and can be found
> +at ``dts/framework/config/conf_yaml_schema.json``. Whenever the models are changed, the schema
> +should be regenerated using the dedicated script at ``dts/generate-schema.py``, e.g.:

Should we add this to devtools/dts-check-format.sh? Looks like a good 
candidate.

> +
> +.. code-block:: console
> +
> +   $ poetry shell
> +   (dts-py3.10) $ ./generate-schema.py
> +
> +
>   Definitions
>   ~~~~~~~~~~~

The definition names have changed, and maybe there are other changes as 
well - or does that not matter? Can these Pydantic changes help us with 
generating this schema description as well?

> diff --git a/dts/generate-schema.py b/dts/generate-schema.py

> @@ -0,0 +1,38 @@
> +#!/usr/bin/env python3
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2024 Arm Limited
> +
> +"""JSON schema generation script."""

This should at least say how to run the script, but we probably want to 
add more, such as what it creates the schema from and where it's going 
to put it.
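
Something like (a sketch; the run commands are the ones from the docs 
update in this patch):

    """JSON schema generation script.

    Generates a JSON schema from the Pydantic configuration models and
    writes it to framework/config/conf_yaml_schema.json. Run from the dts
    directory inside the poetry environment:

        $ poetry shell
        (dts-py3.10) $ ./generate-schema.py
    """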


> +from framework.config import ConfigurationType
> +

Ah, so it is used elsewhere. Let's just rename it then.

> +DTS_DIR = os.path.dirname(os.path.realpath(__file__))
> +RELATIVE_PATH_TO_SCHEMA = "framework/config/conf_yaml_schema.json"

We're using pathlib everywhere in DTS, so let's use it here as well. Not 
sure if the portability is needed in this script, but why not.
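
E.g. (sketch):

    from pathlib import Path

    DTS_DIR = Path(__file__).parent.resolve()
    SCHEMA_PATH = DTS_DIR / "framework" / "config" / "conf_yaml_schema.json"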


> +class GenerateSchemaWithDialect(GenerateJsonSchema):
> +    """Custom schema generator which adds the schema dialect."""

I'd add that we're adding a reference to the schema dialect.


> +    print("Schema generated successfully!")
> +except Exception as e:
> +    raise Exception("failed to generate schema") from e

Let's unify the message with the print above by capitalizing and adding 
a dot to the end.


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
  2024-09-16 13:17   ` Juraj Linkeš
@ 2024-09-19 19:56   ` Nicholas Pratte
  2024-09-30 20:41   ` Dean Marx
  2 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-09-19 19:56 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

I understand Juraj's concern about the dependencies, assuming this
change is to stay in its own isolated patch in the log. That aside,
this is pretty straightforward, and I have confidence in whatever
decision is made between the two of you.

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Add Pydantic to the project dependencies while dropping Warlock.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/poetry.lock    | 346 +++++++++++++++++----------------------------
>  dts/pyproject.toml |   3 +-
>  2 files changed, 135 insertions(+), 214 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 5f8fa03933..c5b0d059a8 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,23 +1,16 @@
>  # This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
>
>  [[package]]
> -name = "attrs"
> -version = "23.1.0"
> -description = "Classes Without Boilerplate"
> +name = "annotated-types"
> +version = "0.7.0"
> +description = "Reusable constraint types to use with typing.Annotated"
>  optional = false
> -python-versions = ">=3.7"
> +python-versions = ">=3.8"
>  files = [
> -    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
> -    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
> +    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
> +    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
>  ]
>
> -[package.extras]
> -cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
> -dev = ["attrs[docs,tests]", "pre-commit"]
> -docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
> -tests = ["attrs[tests-no-zope]", "zope-interface"]
> -tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
> -
>  [[package]]
>  name = "bcrypt"
>  version = "4.0.1"
> @@ -280,66 +273,6 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
>  plugins = ["setuptools"]
>  requirements-deprecated-finder = ["pip-api", "pipreqs"]
>
> -[[package]]
> -name = "jsonpatch"
> -version = "1.33"
> -description = "Apply JSON-Patches (RFC 6902)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
> -    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
> -]
> -
> -[package.dependencies]
> -jsonpointer = ">=1.9"
> -
> -[[package]]
> -name = "jsonpointer"
> -version = "2.4"
> -description = "Identify specific nodes in a JSON document (RFC 6901)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
> -    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
> -]
> -
> -[[package]]
> -name = "jsonschema"
> -version = "4.18.4"
> -description = "An implementation of JSON Schema validation for Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
> -    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -jsonschema-specifications = ">=2023.03.6"
> -referencing = ">=0.28.4"
> -rpds-py = ">=0.7.1"
> -
> -[package.extras]
> -format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
> -format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
> -
> -[[package]]
> -name = "jsonschema-specifications"
> -version = "2023.7.1"
> -description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
> -    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
> -]
> -
> -[package.dependencies]
> -referencing = ">=0.28.0"
> -
>  [[package]]
>  name = "mccabe"
>  version = "0.7.0"
> @@ -492,6 +425,129 @@ files = [
>      {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
>  ]
>
> +[[package]]
> +name = "pydantic"
> +version = "2.8.2"
> +description = "Data validation using Python type hints"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic-2.8.2-py3-none-any.whl", hash = "sha256:73ee9fddd406dc318b885c7a2eab8a6472b68b8fb5ba8150949fc3db939f23c8"},
> +    {file = "pydantic-2.8.2.tar.gz", hash = "sha256:6f62c13d067b0755ad1c21a34bdd06c0c12625a22b0fc09c6b149816604f7c2a"},
> +]
> +
> +[package.dependencies]
> +annotated-types = ">=0.4.0"
> +pydantic-core = "2.20.1"
> +typing-extensions = [
> +    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
> +    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
> +]
> +
> +[package.extras]
> +email = ["email-validator (>=2.0.0)"]
> +
> +[[package]]
> +name = "pydantic-core"
> +version = "2.20.1"
> +description = "Core functionality for Pydantic validation and serialization"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3acae97ffd19bf091c72df4d726d552c473f3576409b2a7ca36b2f535ffff4a3"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:41f4c96227a67a013e7de5ff8f20fb496ce573893b7f4f2707d065907bffdbd6"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5f239eb799a2081495ea659d8d4a43a8f42cd1fe9ff2e7e436295c38a10c286a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:53e431da3fc53360db73eedf6f7124d1076e1b4ee4276b36fb25514544ceb4a3"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f1f62b2413c3a0e846c3b838b2ecd6c7a19ec6793b2a522745b0869e37ab5bc1"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5d41e6daee2813ecceea8eda38062d69e280b39df793f5a942fa515b8ed67953"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d482efec8b7dc6bfaedc0f166b2ce349df0011f5d2f1f25537ced4cfc34fd98"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e93e1a4b4b33daed65d781a57a522ff153dcf748dee70b40c7258c5861e1768a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e7c4ea22b6739b162c9ecaaa41d718dfad48a244909fe7ef4b54c0b530effc5a"},
> +    {file = "pydantic_core-2.20.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4f2790949cf385d985a31984907fecb3896999329103df4e4983a4a41e13e840"},
> +    {file = "pydantic_core-2.20.1-cp310-none-win32.whl", hash = "sha256:5e999ba8dd90e93d57410c5e67ebb67ffcaadcea0ad973240fdfd3a135506250"},
> +    {file = "pydantic_core-2.20.1-cp310-none-win_amd64.whl", hash = "sha256:512ecfbefef6dac7bc5eaaf46177b2de58cdf7acac8793fe033b24ece0b9566c"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:d2a8fa9d6d6f891f3deec72f5cc668e6f66b188ab14bb1ab52422fe8e644f312"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:175873691124f3d0da55aeea1d90660a6ea7a3cfea137c38afa0a5ffabe37b88"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:37eee5b638f0e0dcd18d21f59b679686bbd18917b87db0193ae36f9c23c355fc"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:25e9185e2d06c16ee438ed39bf62935ec436474a6ac4f9358524220f1b236e43"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:150906b40ff188a3260cbee25380e7494ee85048584998c1e66df0c7a11c17a6"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8ad4aeb3e9a97286573c03df758fc7627aecdd02f1da04516a86dc159bf70121"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d3f3ed29cd9f978c604708511a1f9c2fdcb6c38b9aae36a51905b8811ee5cbf1"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0dae11d8f5ded51699c74d9548dcc5938e0804cc8298ec0aa0da95c21fff57b"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:faa6b09ee09433b87992fb5a2859efd1c264ddc37280d2dd5db502126d0e7f27"},
> +    {file = "pydantic_core-2.20.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9dc1b507c12eb0481d071f3c1808f0529ad41dc415d0ca11f7ebfc666e66a18b"},
> +    {file = "pydantic_core-2.20.1-cp311-none-win32.whl", hash = "sha256:fa2fddcb7107e0d1808086ca306dcade7df60a13a6c347a7acf1ec139aa6789a"},
> +    {file = "pydantic_core-2.20.1-cp311-none-win_amd64.whl", hash = "sha256:40a783fb7ee353c50bd3853e626f15677ea527ae556429453685ae32280c19c2"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:595ba5be69b35777474fa07f80fc260ea71255656191adb22a8c53aba4479231"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a4f55095ad087474999ee28d3398bae183a66be4823f753cd7d67dd0153427c9"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f9aa05d09ecf4c75157197f27cdc9cfaeb7c5f15021c6373932bf3e124af029f"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e97fdf088d4b31ff4ba35db26d9cc472ac7ef4a2ff2badeabf8d727b3377fc52"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:bc633a9fe1eb87e250b5c57d389cf28998e4292336926b0b6cdaee353f89a237"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d573faf8eb7e6b1cbbcb4f5b247c60ca8be39fe2c674495df0eb4318303137fe"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:26dc97754b57d2fd00ac2b24dfa341abffc380b823211994c4efac7f13b9e90e"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:33499e85e739a4b60c9dac710c20a08dc73cb3240c9a0e22325e671b27b70d24"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:bebb4d6715c814597f85297c332297c6ce81e29436125ca59d1159b07f423eb1"},
> +    {file = "pydantic_core-2.20.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:516d9227919612425c8ef1c9b869bbbee249bc91912c8aaffb66116c0b447ebd"},
> +    {file = "pydantic_core-2.20.1-cp312-none-win32.whl", hash = "sha256:469f29f9093c9d834432034d33f5fe45699e664f12a13bf38c04967ce233d688"},
> +    {file = "pydantic_core-2.20.1-cp312-none-win_amd64.whl", hash = "sha256:035ede2e16da7281041f0e626459bcae33ed998cca6a0a007a5ebb73414ac72d"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:0827505a5c87e8aa285dc31e9ec7f4a17c81a813d45f70b1d9164e03a813a686"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:19c0fa39fa154e7e0b7f82f88ef85faa2a4c23cc65aae2f5aea625e3c13c735a"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa223cd1e36b642092c326d694d8bf59b71ddddc94cdb752bbbb1c5c91d833b"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c336a6d235522a62fef872c6295a42ecb0c4e1d0f1a3e500fe949415761b8a19"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7eb6a0587eded33aeefea9f916899d42b1799b7b14b8f8ff2753c0ac1741edac"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:70c8daf4faca8da5a6d655f9af86faf6ec2e1768f4b8b9d0226c02f3d6209703"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9fa4c9bf273ca41f940bceb86922a7667cd5bf90e95dbb157cbb8441008482c"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:11b71d67b4725e7e2a9f6e9c0ac1239bbc0c48cce3dc59f98635efc57d6dac83"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:270755f15174fb983890c49881e93f8f1b80f0b5e3a3cc1394a255706cabd203"},
> +    {file = "pydantic_core-2.20.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c81131869240e3e568916ef4c307f8b99583efaa60a8112ef27a366eefba8ef0"},
> +    {file = "pydantic_core-2.20.1-cp313-none-win32.whl", hash = "sha256:b91ced227c41aa29c672814f50dbb05ec93536abf8f43cd14ec9521ea09afe4e"},
> +    {file = "pydantic_core-2.20.1-cp313-none-win_amd64.whl", hash = "sha256:65db0f2eefcaad1a3950f498aabb4875c8890438bc80b19362cf633b87a8ab20"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:4745f4ac52cc6686390c40eaa01d48b18997cb130833154801a442323cc78f91"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:a8ad4c766d3f33ba8fd692f9aa297c9058970530a32c728a2c4bfd2616d3358b"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41e81317dd6a0127cabce83c0c9c3fbecceae981c8391e6f1dec88a77c8a569a"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:04024d270cf63f586ad41fff13fde4311c4fc13ea74676962c876d9577bcc78f"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eaad4ff2de1c3823fddf82f41121bdf453d922e9a238642b1dedb33c4e4f98ad"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:26ab812fa0c845df815e506be30337e2df27e88399b985d0bb4e3ecfe72df31c"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3c5ebac750d9d5f2706654c638c041635c385596caf68f81342011ddfa1e5598"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2aafc5a503855ea5885559eae883978c9b6d8c8993d67766ee73d82e841300dd"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:4868f6bd7c9d98904b748a2653031fc9c2f85b6237009d475b1008bfaeb0a5aa"},
> +    {file = "pydantic_core-2.20.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:aa2f457b4af386254372dfa78a2eda2563680d982422641a85f271c859df1987"},
> +    {file = "pydantic_core-2.20.1-cp38-none-win32.whl", hash = "sha256:225b67a1f6d602de0ce7f6c1c3ae89a4aa25d3de9be857999e9124f15dab486a"},
> +    {file = "pydantic_core-2.20.1-cp38-none-win_amd64.whl", hash = "sha256:6b507132dcfc0dea440cce23ee2182c0ce7aba7054576efc65634f080dbe9434"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b03f7941783b4c4a26051846dea594628b38f6940a2fdc0df00b221aed39314c"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1eedfeb6089ed3fad42e81a67755846ad4dcc14d73698c120a82e4ccf0f1f9f6"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:635fee4e041ab9c479e31edda27fcf966ea9614fff1317e280d99eb3e5ab6fe2"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:77bf3ac639c1ff567ae3b47f8d4cc3dc20f9966a2a6dd2311dcc055d3d04fb8a"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ed1b0132f24beeec5a78b67d9388656d03e6a7c837394f99257e2d55b461611"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c6514f963b023aeee506678a1cf821fe31159b925c4b76fe2afa94cc70b3222b"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10d4204d8ca33146e761c79f83cc861df20e7ae9f6487ca290a97702daf56006"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2d036c7187b9422ae5b262badb87a20a49eb6c5238b2004e96d4da1231badef1"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9ebfef07dbe1d93efb94b4700f2d278494e9162565a54f124c404a5656d7ff09"},
> +    {file = "pydantic_core-2.20.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:6b9d9bb600328a1ce523ab4f454859e9d439150abb0906c5a1983c146580ebab"},
> +    {file = "pydantic_core-2.20.1-cp39-none-win32.whl", hash = "sha256:784c1214cb6dd1e3b15dd8b91b9a53852aed16671cc3fbe4786f4f1db07089e2"},
> +    {file = "pydantic_core-2.20.1-cp39-none-win_amd64.whl", hash = "sha256:d2fe69c5434391727efa54b47a1e7986bb0186e72a41b203df8f5b0a19a4f669"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a45f84b09ac9c3d35dfcf6a27fd0634d30d183205230a0ebe8373a0e8cfa0906"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d02a72df14dfdbaf228424573a07af10637bd490f0901cee872c4f434a735b94"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2b27e6af28f07e2f195552b37d7d66b150adbaa39a6d327766ffd695799780f"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:084659fac3c83fd674596612aeff6041a18402f1e1bc19ca39e417d554468482"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:242b8feb3c493ab78be289c034a1f659e8826e2233786e36f2893a950a719bb6"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:38cf1c40a921d05c5edc61a785c0ddb4bed67827069f535d794ce6bcded919fc"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:e0bbdd76ce9aa5d4209d65f2b27fc6e5ef1312ae6c5333c26db3f5ade53a1e99"},
> +    {file = "pydantic_core-2.20.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:254ec27fdb5b1ee60684f91683be95e5133c994cc54e86a0b0963afa25c8f8a6"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:407653af5617f0757261ae249d3fba09504d7a71ab36ac057c938572d1bc9331"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:c693e916709c2465b02ca0ad7b387c4f8423d1db7b4649c551f27a529181c5ad"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b5ff4911aea936a47d9376fd3ab17e970cc543d1b68921886e7f64bd28308d1"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:177f55a886d74f1808763976ac4efd29b7ed15c69f4d838bbd74d9d09cf6fa86"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:964faa8a861d2664f0c7ab0c181af0bea66098b1919439815ca8803ef136fc4e"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:4dd484681c15e6b9a977c785a345d3e378d72678fd5f1f3c0509608da24f2ac0"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f6d6cff3538391e8486a431569b77921adfcdef14eb18fbf19b7c0a5294d4e6a"},
> +    {file = "pydantic_core-2.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:a6d511cc297ff0883bc3708b465ff82d7560193169a8b93260f74ecb0a5e08a7"},
> +    {file = "pydantic_core-2.20.1.tar.gz", hash = "sha256:26ca695eeee5f9f1aeeb211ffc12f10bcb6f71e2989988fda61dabd65db878d4"},
> +]
> +
> +[package.dependencies]
> +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
> +
>  [[package]]
>  name = "pydocstyle"
>  version = "6.1.1"
> @@ -633,127 +689,6 @@ files = [
>      {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
>  ]
>
> -[[package]]
> -name = "referencing"
> -version = "0.30.0"
> -description = "JSON Referencing + Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
> -    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -rpds-py = ">=0.7.0"
> -
> -[[package]]
> -name = "rpds-py"
> -version = "0.9.2"
> -description = "Python bindings to Rust's persistent data structures (rpds)"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
> -    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
> -    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
> -    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
> -    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
> -    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
> -    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
> -    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
> -    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
> -    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
> -]
> -
>  [[package]]
>  name = "scapy"
>  version = "2.5.0"
> @@ -826,31 +761,16 @@ files = [
>
>  [[package]]
>  name = "typing-extensions"
> -version = "4.11.0"
> +version = "4.12.2"
>  description = "Backported and Experimental Type Hints for Python 3.8+"
>  optional = false
>  python-versions = ">=3.8"
>  files = [
> -    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
> -    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
> +    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
> +    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
>  ]
>
> -[[package]]
> -name = "warlock"
> -version = "2.0.1"
> -description = "Python object model built on JSON schema and JSON patch."
> -optional = false
> -python-versions = ">=3.7,<4.0"
> -files = [
> -    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
> -    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
> -]
> -
> -[package.dependencies]
> -jsonpatch = ">=1,<2"
> -jsonschema = ">=4,<5"
> -
>  [metadata]
>  lock-version = "2.0"
>  python-versions = "^3.10"
> -content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
> +content-hash = "f69ffb8c1545d7beb035533dab109722f844f39f9ffd46b7aceb386e90fa039d"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 0b9b09805a..e5785f27d8 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -19,13 +19,13 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
>
>  [tool.poetry.dependencies]
>  python = "^3.10"
> -warlock = "^2.0.1"
>  PyYAML = "^6.0"
>  types-PyYAML = "^6.0.8"
>  fabric = "^2.7.1"
>  scapy = "^2.5.0"
>  pydocstyle = "6.1.1"
>  typing-extensions = "^4.11.0"
> +pydantic = "^2.8.2"
>
>  [tool.poetry.group.dev.dependencies]
>  mypy = "^1.10.0"
> @@ -55,6 +55,7 @@ python_version = "3.10"
>  enable_error_code = ["ignore-without-code"]
>  show_error_codes = true
>  warn_unused_ignores = true
> +plugins = "pydantic.mypy"
>
>  [tool.isort]
>  profile = "black"
> --
> 2.34.1
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
  2024-09-16 13:00   ` Juraj Linkeš
@ 2024-09-19 20:01   ` Nicholas Pratte
  1 sibling, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-09-19 20:01 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

I think Juraj's comments here make sense; it would probably be best to
separate this in conjunction with Juraj's decorator patch and use it
as a dependency. From what I can understand, the changes offered here
look good to me.

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
  2024-09-17 11:13   ` Juraj Linkeš
@ 2024-09-30 17:56   ` Nicholas Pratte
  2024-10-29 12:41     ` Luca Vizzarro
  2024-09-30 21:45   ` Dean Marx
  2 siblings, 1 reply; 83+ messages in thread
From: Nicholas Pratte @ 2024-09-30 17:56 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

Hi Luca! See my comments below, thanks!

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> This change brings in Pydantic in place of Warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> more resilient and simpler code. As a consequence of this change:
>
> - most validation is now built-in
> - further validation is added to verify:
>   - cross referencing of node names and ports
>   - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used for validation but kept as an
>   alternative format for the developer
> - the config schema can now be generated automatically from the
>   Pydantic models
> - the TrafficGeneratorType enum has been changed from inheriting
>   StrEnum to the native str and Enum. This change was necessary to
>   enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>   match the structure of the configuration files
> - the test suite argument is updated to catch the ValidationError
>   that TestSuiteConfig can now raise
>
> Bugzilla ID: 1508
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/framework/config/__init__.py              | 588 +++++++++---------
>  dts/framework/config/types.py                 | 132 ----
>  dts/framework/runner.py                       |  35 +-
>  dts/framework/settings.py                     |  16 +-
>  dts/framework/testbed_model/sut_node.py       |   2 +-
>  .../traffic_generator/__init__.py             |   4 +-
>  .../traffic_generator/traffic_generator.py    |   2 +-
>  7 files changed, 325 insertions(+), 454 deletions(-)
>  delete mode 100644 dts/framework/config/types.py
>
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index df60a5030e..013c529829 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -2,17 +2,19 @@
>  # Copyright(c) 2010-2021 Intel Corporation
>  # Copyright(c) 2022-2023 University of New Hampshire
>  # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>
>  """Testbed configuration and test suite specification.
>
>  This package offers classes that hold real-time information about the testbed, hold test run
>  configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
> -the YAML test run configuration file
> -and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> +dataclass model. The Pydantic model is also available as

Out of curiosity, what is the reason for maintaining the use of
dataclasses here as opposed to subclassing Pydantic's BaseModel? I
suppose both implementations would lead to the same result, but is it
mostly for the sake of familiarity and consistency that we're still
using dataclasses here?
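
For comparison, here's roughly what I mean by the two styles -- a
minimal sketch with made-up field names, assuming Pydantic v2, where
both approaches validate on construction:

    from pydantic import BaseModel
    from pydantic.dataclasses import dataclass


    class NodeModel(BaseModel):
        """BaseModel style: validation plus the model_* helper API."""

        name: str
        os: str


    @dataclass(slots=True, frozen=True)
    class NodeDataclass:
        """Dataclass style: stdlib dataclass semantics, same validation."""

        name: str
        os: str


    # Both reject bad input with pydantic.ValidationError:
    NodeModel(name="sut1", os="linux")
    NodeDataclass(name="sut1", os="linux")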

> +:download:`JSON schema <conf_yaml_schema.json>`.
>
>  The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
> -this package. The allowed keys and types inside this dictionary are defined in
> -the :doc:`types <framework.config.types>` module.
> +this package. The allowed keys and types inside this dictionary map directly to the
> +:class:`Configuration` model, its fields and sub-models.
>
>  The test run configuration has two main sections:
>
> @@ -24,7 +26,7 @@
>
>  The real-time information about testbed is supposed to be gathered at runtime.
>
> -The classes defined in this package make heavy use of :mod:`dataclasses`.
> +The classes defined in this package make heavy use of :mod:`pydantic.dataclasses`.
>  All of them use slots and are frozen:
>
>      * Slots enables some optimizations, by pre-allocating space for the defined
> @@ -33,29 +35,31 @@
>        and makes it thread safe should we ever want to move in that direction.
>  """
>
> -import json
> -import os.path
> -from dataclasses import dataclass, fields
> -from enum import auto, unique
> +from enum import Enum, auto, unique
> +from functools import cached_property
>  from pathlib import Path
> -from typing import Union
> +from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple, Protocol
>
> -import warlock  # type: ignore[import-untyped]
>  import yaml
> +from pydantic import (
> +    ConfigDict,
> +    Field,
> +    StringConstraints,
> +    TypeAdapter,
> +    ValidationError,
> +    field_validator,
> +    model_validator,
> +)
> +from pydantic.config import JsonDict
> +from pydantic.dataclasses import dataclass
>  from typing_extensions import Self
>
> -from framework.config.types import (
> -    BuildTargetConfigDict,
> -    ConfigurationDict,
> -    NodeConfigDict,
> -    PortConfigDict,
> -    TestRunConfigDict,
> -    TestSuiteConfigDict,
> -    TrafficGeneratorConfigDict,
> -)
>  from framework.exception import ConfigurationError
>  from framework.utils import StrEnum
>
> +if TYPE_CHECKING:
> +    from framework.test_suite import TestSuiteSpec
> +
>
>  @unique
>  class Architecture(StrEnum):
> @@ -116,14 +120,14 @@ class Compiler(StrEnum):
>
>
>  @unique
> -class TrafficGeneratorType(StrEnum):
> +class TrafficGeneratorType(str, Enum):
>      """The supported traffic generators."""
>
>      #:
> -    SCAPY = auto()
> +    SCAPY = "SCAPY"

Going off of Juraj's comments, would you be able to provide a deeper
explanation of how this new parameterization of str and Enum works
with respect to the Pydantic field discriminators?
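
For reference, here is a minimal sketch of the pattern as I understand
it -- the class names and the second enum member are made up purely for
illustration, this is not the actual DTS code:

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import Field, TypeAdapter
    from pydantic.dataclasses import dataclass


    class TrafficGeneratorType(str, Enum):
        SCAPY = "SCAPY"
        DUMMY = "DUMMY"  # hypothetical member, only to form a union


    @dataclass
    class ScapyConfig:
        # the Literal-typed field acts as the discriminator
        type: Literal[TrafficGeneratorType.SCAPY]


    @dataclass
    class DummyConfig:
        type: Literal[TrafficGeneratorType.DUMMY]


    Config = Annotated[ScapyConfig | DummyConfig, Field(discriminator="type")]

    # Because the enum also subclasses str, the plain "SCAPY" string read
    # from the YAML matches the enum member during validation.
    TypeAdapter(Config).validate_python({"type": "SCAPY"})  # -> ScapyConfig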
<snip>

The rest of what you've provided is more straightforward, but I'd
still like to take a deeper look and review it further. The questions
I've raised here are probably my most significant.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 2/5] dts: add Pydantic and remove Warlock
  2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
  2024-09-16 13:17   ` Juraj Linkeš
  2024-09-19 19:56   ` Nicholas Pratte
@ 2024-09-30 20:41   ` Dean Marx
  2 siblings, 0 replies; 83+ messages in thread
From: Dean Marx @ 2024-09-30 20:41 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

[-- Attachment #1: Type: text/plain, Size: 652 bytes --]

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com>
wrote:

> Add Pydantic to the project dependencies while dropping Warlock.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/poetry.lock    | 346 +++++++++++++++++----------------------------
>  dts/pyproject.toml |   3 +-
>  2 files changed, 135 insertions(+), 214 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 5f8fa03933..c5b0d059a8 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,23 +1,16 @@


 Reviewed-by: Dean Marx <dmarx@iol.unh.edu>

[-- Attachment #2: Type: text/html, Size: 1140 bytes --]

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
  2024-09-17 11:13   ` Juraj Linkeš
  2024-09-30 17:56   ` Nicholas Pratte
@ 2024-09-30 21:45   ` Dean Marx
  2024-10-29 12:51     ` Luca Vizzarro
  2 siblings, 1 reply; 83+ messages in thread
From: Dean Marx @ 2024-09-30 21:45 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

[-- Attachment #1: Type: text/plain, Size: 2656 bytes --]

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com>
wrote:

> This change brings in Pydantic in place of Warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> more resilient and simpler code. As a consequence of this change:
>
> - most validation is now built-in
> - further validation is added to verify:
>   - cross referencing of node names and ports
>   - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used for validation but kept as an
>   alternative format for the developer
> - the config schema can now be generated automatically from the
>   Pydantic models
> - the TrafficGeneratorType enum has been changed from inheriting
>   StrEnum to the native str and Enum. This change was necessary to
>   enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>   match the structure of the configuration files
> - the test suite argument is updated to catch the ValidationError
>   that TestSuiteConfig can now raise
>
> Bugzilla ID: 1508
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>


 <snip>

> -@dataclass(slots=True, frozen=True)
> +@dataclass(slots=True, frozen=True, kw_only=True,
> config=ConfigDict(extra="forbid"))
>

Up to you, but I think it might be worth specifying what some of these
extra Pydantic arguments are for if we're going to keep the name of the
decorator as "dataclass." For example, the ConfigDict "forbid" argument
seems to be commonly used, as are the "before"/"after" modes of the
model_validator. Maybe add a brief description somewhere in the
docstrings, just so others can see how it differs from the previous
implementation even without experience using Pydantic.
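
To illustrate the "forbid" part -- a rough sketch with an assumed field
name, not the actual DTS class -- unknown keys in the parsed YAML become
validation errors instead of being silently dropped:

    from pydantic import ConfigDict, TypeAdapter, ValidationError
    from pydantic.dataclasses import dataclass


    @dataclass(frozen=True, kw_only=True, config=ConfigDict(extra="forbid"))
    class PortConfig:
        pci: str


    try:
        # the misspelt "pcie" key is rejected rather than ignored
        TypeAdapter(PortConfig).validate_python(
            {"pci": "0000:00:08.0", "pcie": "oops"}
        )
    except ValidationError as e:
        print(e)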

<snip>

> +    @model_validator(mode="before")
>      @classmethod
> -    def from_dict(
> -        cls,
> -        entry: str | TestSuiteConfigDict,
> -    ) -> Self:
> -        """Create an instance from two different types.
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation into a valid mapping."""
> +        if isinstance(data, str):
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
>

Again, this is completely your call, but it might be worth explaining
in the docstrings why this "before" mode is used here while the other
validators run with "after."
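
As I read it -- simplified from the hunk above, with the surrounding
class reduced to just the relevant fields -- "before" is needed because
the raw input may still be a plain string at that point, whereas an
"after" validator only sees the already-validated instance:

    from typing import Any

    from pydantic import TypeAdapter, model_validator
    from pydantic.dataclasses import dataclass


    @dataclass
    class TestSuiteConfig:
        test_suite: str
        test_cases: list[str]

        @model_validator(mode="before")
        @classmethod
        def convert_from_string(cls, data: Any) -> Any:
            # runs on the raw input, so a bare "suite case1 case2"
            # string can be normalized into the expected mapping
            if isinstance(data, str):
                test_suite, *test_cases = data.split()
                return dict(test_suite=test_suite, test_cases=test_cases)
            return data


    TypeAdapter(TestSuiteConfig).validate_python("hello_world test_main")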

[-- Attachment #2: Type: text/html, Size: 3651 bytes --]

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  2024-09-17 11:39   ` Juraj Linkeš
@ 2024-10-01 17:12   ` Dean Marx
  2024-10-29 12:54     ` Luca Vizzarro
  2024-10-01 20:45   ` Nicholas Pratte
  2 siblings, 1 reply; 83+ messages in thread
From: Dean Marx @ 2024-10-01 17:12 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

[-- Attachment #1: Type: text/plain, Size: 1038 bytes --]

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com>
wrote:

> The introduction of TestSuiteSpec adds auto-discovery of test suites,
> which are also automatically imported. This causes double imports as the
> runner loads the test suites. This changes the behaviour of the runner
> to load the imported classes from TestSuiteSpec instead of importing
> them again.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
>

<snip>

> +            filtered_test_cases: list[TestCase] = [
> +                test_case
> +                for test_case in test_suite_spec.test_cases
> +                if not test_suite_config.test_cases_names
> +                or test_case.name in test_suite_config.test_cases_names
> +            ]
>
>
Just wondering, what's the plan for the filtered test cases? I'm assuming
they're stored here so we can report which cases were skipped after the run?
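
For what it's worth, the selection semantics of that comprehension seem
to reduce to the following (illustrative names, not the DTS code):

    def filter_test_cases(available: list[str], selected: list[str]) -> list[str]:
        """Empty selection means run everything; otherwise keep the named cases."""
        return [name for name in available if not selected or name in selected]


    filter_test_cases(["test_a", "test_b"], [])          # ["test_a", "test_b"]
    filter_test_cases(["test_a", "test_b"], ["test_b"])  # ["test_b"]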

Reviewed-by: Dean Marx <dmarx@iol.unh.edu>

[-- Attachment #2: Type: text/html, Size: 1854 bytes --]

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  2024-09-17 11:39   ` Juraj Linkeš
  2024-10-01 17:12   ` Dean Marx
@ 2024-10-01 20:45   ` Nicholas Pratte
  2024-10-29 12:56     ` Luca Vizzarro
  2 siblings, 1 reply; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-01 20:45 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

The code you have here makes sense, and I like the implementation as
it removes a lot of fluff in DTSRunner. I know Juraj mentioned in an
earlier patch in this series that this functionality intersects with
the capabilities series, but I'm missing a lot of context to
understand that fully. Maybe you could provide some insight? I'll make
sure to analyse this deeper in my own time as well. Beyond that:

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> The introduction of TestSuiteSpec adds auto-discovery of test suites,
> which are also automatically imported. This causes double imports as the
> runner loads the test suites. This changes the behaviour of the runner
> to load the imported classes from TestSuiteSpec instead of importing
> them again.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/framework/runner.py | 167 +++++++---------------------------------
>  1 file changed, 27 insertions(+), 140 deletions(-)
>
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index 14e405aced..00b63cc292 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -2,6 +2,7 @@
>  # Copyright(c) 2010-2019 Intel Corporation
>  # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>  # Copyright(c) 2022-2023 University of New Hampshire
> +# Copyright(c) 2024 Arm Limited
>
>  """Test suite runner module.
>
> @@ -17,14 +18,11 @@
>  and the test case stage runs test cases individually.
>  """
>
> -import importlib
> -import inspect
>  import os
> -import re
>  import sys
>  from pathlib import Path
>  from types import FunctionType
> -from typing import Iterable, Sequence
> +from typing import Iterable
>
>  from framework.testbed_model.sut_node import SutNode
>  from framework.testbed_model.tg_node import TGNode
> @@ -38,12 +36,7 @@
>      TGNodeConfiguration,
>      load_config,
>  )
> -from .exception import (
> -    BlockingTestSuiteError,
> -    ConfigurationError,
> -    SSHTimeoutError,
> -    TestCaseVerifyError,
> -)
> +from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
>  from .logger import DTSLogger, DtsStage, get_dts_logger
>  from .settings import SETTINGS
>  from .test_result import (
> @@ -55,7 +48,7 @@
>      TestSuiteResult,
>      TestSuiteWithCases,
>  )
> -from .test_suite import TestSuite
> +from .test_suite import TestCase, TestCaseVariant, TestSuite
>
>
>  class DTSRunner:
> @@ -217,11 +210,10 @@ def _get_test_suites_with_cases(
>          func: bool,
>          perf: bool,
>      ) -> list[TestSuiteWithCases]:
> -        """Test suites with test cases discovery.
> +        """Get test suites with selected cases.
>
> -        The test suites with test cases defined in the user configuration are discovered
> -        and stored for future use so that we don't import the modules twice and so that
> -        the list of test suites with test cases is available for recording right away.
> +        The test suites with test cases defined in the user configuration are selected
> +        and the corresponding functions and classes are gathered.
>
>          Args:
>              test_suite_configs: Test suite configurations.
> @@ -229,139 +221,34 @@ def _get_test_suites_with_cases(
>              perf: Whether to include performance test cases in the final list.
>
>          Returns:
> -            The discovered test suites, each with test cases.
> +            The test suites, each with test cases.
>          """
>          test_suites_with_cases = []
>
>          for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
> -            test_cases = []
> -            func_test_cases, perf_test_cases = self._filter_test_cases(
> -                test_suite_class, test_suite_config.test_cases_names
> -            )
> -            if func:
> -                test_cases.extend(func_test_cases)
> -            if perf:
> -                test_cases.extend(perf_test_cases)
> -
> -            test_suites_with_cases.append(
> -                TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
> -            )
> -
> -        return test_suites_with_cases
> -
> -    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
> -        """Find the :class:`TestSuite` class in `module_name`.
> -
> -        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
> -        The module name is a standard filename with words separated with underscores.
> -        Search the `module_name` for a :class:`TestSuite` class which starts
> -        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
> -        The first matching class is returned.
> -
> -        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
> -
> -            OS -> Os
> -            TCP -> Tcp
> -
> -        Args:
> -            module_name: The module name without prefix where to search for the test suite.
> -
> -        Returns:
> -            The found test suite class.
> -
> -        Raises:
> -            ConfigurationError: If the corresponding module is not found or
> -                a valid :class:`TestSuite` is not found in the module.
> -        """
> -
> -        def is_test_suite(object) -> bool:
> -            """Check whether `object` is a :class:`TestSuite`.
> -
> -            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
> -
> -            Args:
> -                object: The object to be checked.
> -
> -            Returns:
> -                :data:`True` if `object` is a subclass of `TestSuite`.
> -            """
> -            try:
> -                if issubclass(object, TestSuite) and object is not TestSuite:
> -                    return True
> -            except TypeError:
> -                return False
> -            return False
> -
> -        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
> -        try:
> -            test_suite_module = importlib.import_module(testsuite_module_path)
> -        except ModuleNotFoundError as e:
> -            raise ConfigurationError(
> -                f"Test suite module '{testsuite_module_path}' not found."
> -            ) from e
> -
> -        camel_case_suite_name = "".join(
> -            [suite_word.capitalize() for suite_word in module_name.split("_")]
> -        )
> -        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
> -        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
> -            if class_name == full_suite_name_to_find:
> -                return class_obj
> -        raise ConfigurationError(
> -            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
> -        )
> -
> -    def _filter_test_cases(
> -        self, test_suite_class: type[TestSuite], test_cases_to_run: Sequence[str]
> -    ) -> tuple[list[FunctionType], list[FunctionType]]:
> -        """Filter `test_cases_to_run` from `test_suite_class`.
> -
> -        There are two rounds of filtering if `test_cases_to_run` is not empty.
> -        The first filters `test_cases_to_run` from all methods of `test_suite_class`.
> -        Then the methods are separated into functional and performance test cases.
> -        If a method matches neither the functional nor performance name prefix, it's an error.
> -
> -        Args:
> -            test_suite_class: The class of the test suite.
> -            test_cases_to_run: Test case names to filter from `test_suite_class`.
> -                If empty, return all matching test cases.
> -
> -        Returns:
> -            A list of test case methods that should be executed.
> +            test_suite_spec = test_suite_config.test_suite_spec
> +            test_suite_class = test_suite_spec.class_type
> +
> +            filtered_test_cases: list[TestCase] = [
> +                test_case
> +                for test_case in test_suite_spec.test_cases
> +                if not test_suite_config.test_cases_names
> +                or test_case.name in test_suite_config.test_cases_names
> +            ]
>
> -        Raises:
> -            ConfigurationError: If a test case from `test_cases_to_run` is not found
> -                or it doesn't match either the functional nor performance name prefix.
> -        """
> -        func_test_cases = []
> -        perf_test_cases = []
> -        name_method_tuples = inspect.getmembers(test_suite_class, inspect.isfunction)
> -        if test_cases_to_run:
> -            name_method_tuples = [
> -                (name, method) for name, method in name_method_tuples if name in test_cases_to_run
> +            selected_test_cases: list[FunctionType] = [
> +                test_case.function_type  # type: ignore[misc]
> +                for test_case in filtered_test_cases
> +                if (func and test_case.variant == TestCaseVariant.FUNCTIONAL)
> +                or (perf and test_case.variant == TestCaseVariant.PERFORMANCE)
>              ]
> -            if len(name_method_tuples) < len(test_cases_to_run):
> -                missing_test_cases = set(test_cases_to_run) - {
> -                    name for name, _ in name_method_tuples
> -                }
> -                raise ConfigurationError(
> -                    f"Test cases {missing_test_cases} not found among methods "
> -                    f"of {test_suite_class.__name__}."
> -                )
>
> -        for test_case_name, test_case_method in name_method_tuples:
> -            if re.match(self._func_test_case_regex, test_case_name):
> -                func_test_cases.append(test_case_method)
> -            elif re.match(self._perf_test_case_regex, test_case_name):
> -                perf_test_cases.append(test_case_method)
> -            elif test_cases_to_run:
> -                raise ConfigurationError(
> -                    f"Method '{test_case_name}' matches neither "
> -                    f"a functional nor a performance test case name."
> +            test_suites_with_cases.append(
> +                TestSuiteWithCases(
> +                    test_suite_class=test_suite_class, test_cases=selected_test_cases
>                  )
> -
> -        return func_test_cases, perf_test_cases
> +            )
> +        return test_suites_with_cases
>
>      def _connect_nodes_and_run_test_run(
>          self,
> --
> 2.34.1
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 5/5] dts: add JSON schema generation script
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
  2024-09-17 11:59   ` Juraj Linkeš
@ 2024-10-01 20:48   ` Nicholas Pratte
  1 sibling, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-01 20:48 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

Seems straightforward. There are actually some intersections between
my currently-existing config changes and some of the trimming you
provide here, which will simplify my upcoming series when I rebase it
on top of this one.

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Thu, Aug 22, 2024 at 12:40 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Add a new script which automatically regenerates the JSON schema file
> based on the Pydantic configuration models.
>
> Moreover, use this script to regenerate the JSON schema for the first
> time.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  doc/guides/tools/dts.rst                   |  10 +
>  dts/framework/config/conf_yaml_schema.json | 776 ++++++++++++---------
>  dts/generate-schema.py                     |  38 +
>  3 files changed, 486 insertions(+), 338 deletions(-)
>  create mode 100755 dts/generate-schema.py
>
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 515b15e4d8..317bd0ff99 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -430,6 +430,16 @@ Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
>  Configuration Schema
>  --------------------
>
> +The configuration schema is automatically generated from Pydantic models and can be found
> +at ``dts/framework/config/conf_yaml_schema.json``. Whenever the models are changed, the schema
> +should be regenerated using the dedicated script at ``dts/generate-schema.py``, e.g.:
> +
> +.. code-block:: console
> +
> +   $ poetry shell
> +   (dts-py3.10) $ ./generate-schema.py
> +
> +
>  Definitions
>  ~~~~~~~~~~~
>
> diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
> index f02a310bb5..1cf1bb098a 100644
> --- a/dts/framework/config/conf_yaml_schema.json
> +++ b/dts/framework/config/conf_yaml_schema.json
> @@ -1,402 +1,502 @@
>  {
> -  "$schema": "https://json-schema.org/draft-07/schema",
> -  "title": "DTS Config Schema",
> -  "definitions": {
> -    "node_name": {
> -      "type": "string",
> -      "description": "A unique identifier for a node"
> -    },
> -    "NIC": {
> -      "type": "string",
> -      "enum": [
> -        "ALL",
> -        "ConnectX3_MT4103",
> -        "ConnectX4_LX_MT4117",
> -        "ConnectX4_MT4115",
> -        "ConnectX5_MT4119",
> -        "ConnectX5_MT4121",
> -        "I40E_10G-10G_BASE_T_BC",
> -        "I40E_10G-10G_BASE_T_X722",
> -        "I40E_10G-SFP_X722",
> -        "I40E_10G-SFP_XL710",
> -        "I40E_10G-X722_A0",
> -        "I40E_1G-1G_BASE_T_X722",
> -        "I40E_25G-25G_SFP28",
> -        "I40E_40G-QSFP_A",
> -        "I40E_40G-QSFP_B",
> -        "IAVF-ADAPTIVE_VF",
> -        "IAVF-VF",
> -        "IAVF_10G-X722_VF",
> -        "ICE_100G-E810C_QSFP",
> -        "ICE_25G-E810C_SFP",
> -        "ICE_25G-E810_XXV_SFP",
> -        "IGB-I350_VF",
> -        "IGB_1G-82540EM",
> -        "IGB_1G-82545EM_COPPER",
> -        "IGB_1G-82571EB_COPPER",
> -        "IGB_1G-82574L",
> -        "IGB_1G-82576",
> -        "IGB_1G-82576_QUAD_COPPER",
> -        "IGB_1G-82576_QUAD_COPPER_ET2",
> -        "IGB_1G-82580_COPPER",
> -        "IGB_1G-I210_COPPER",
> -        "IGB_1G-I350_COPPER",
> -        "IGB_1G-I354_SGMII",
> -        "IGB_1G-PCH_LPTLP_I218_LM",
> -        "IGB_1G-PCH_LPTLP_I218_V",
> -        "IGB_1G-PCH_LPT_I217_LM",
> -        "IGB_1G-PCH_LPT_I217_V",
> -        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
> -        "IGC-I225_LM",
> -        "IGC-I226_LM",
> -        "IXGBE_10G-82599_SFP",
> -        "IXGBE_10G-82599_SFP_SF_QP",
> -        "IXGBE_10G-82599_T3_LOM",
> -        "IXGBE_10G-82599_VF",
> -        "IXGBE_10G-X540T",
> -        "IXGBE_10G-X540_VF",
> -        "IXGBE_10G-X550EM_A_SFP",
> -        "IXGBE_10G-X550EM_X_10G_T",
> -        "IXGBE_10G-X550EM_X_SFP",
> -        "IXGBE_10G-X550EM_X_VF",
> -        "IXGBE_10G-X550T",
> -        "IXGBE_10G-X550_VF",
> -        "brcm_57414",
> -        "brcm_P2100G",
> -        "cavium_0011",
> -        "cavium_a034",
> -        "cavium_a063",
> -        "cavium_a064",
> -        "fastlinq_ql41000",
> -        "fastlinq_ql41000_vf",
> -        "fastlinq_ql45000",
> -        "fastlinq_ql45000_vf",
> -        "hi1822",
> -        "virtio"
> -      ]
> -    },
> -
> -    "ARCH": {
> -      "type": "string",
> +  "$defs": {
> +    "Architecture": {
> +      "description": "The supported architectures of :class:`~framework.testbed_model.node.Node`\\s.",
>        "enum": [
> +        "i686",
>          "x86_64",
> +        "x86_32",
>          "arm64",
>          "ppc64le"
> -      ]
> -    },
> -    "OS": {
> -      "type": "string",
> -      "enum": [
> -        "linux"
> -      ]
> -    },
> -    "cpu": {
> -      "type": "string",
> -      "description": "Native should be the default on x86",
> -      "enum": [
> -        "native",
> -        "armv8a",
> -        "dpaa2",
> -        "thunderx",
> -        "xgene1"
> -      ]
> -    },
> -    "compiler": {
> -      "type": "string",
> -      "enum": [
> -        "gcc",
> -        "clang",
> -        "icc",
> -        "mscv"
> -      ]
> +      ],
> +      "title": "Architecture",
> +      "type": "string"
>      },
> -    "build_target": {
> -      "type": "object",
> -      "description": "Targets supported by DTS",
> +    "BuildTargetConfiguration": {
> +      "additionalProperties": false,
> +      "description": "DPDK build configuration.\n\nThe configuration used for building DPDK.\n\nAttributes:\n    arch: The target architecture to build for.\n    os: The target os to build for.\n    cpu: The target CPU to build for.\n    compiler: The compiler executable to use.\n    compiler_wrapper: This string will be put in front of the compiler when\n        executing the build. Useful for adding wrapper commands, such as ``ccache``.",
>        "properties": {
>          "arch": {
> -          "type": "string",
> -          "enum": [
> -            "ALL",
> -            "x86_64",
> -            "arm64",
> -            "ppc64le",
> -            "other"
> -          ]
> +          "$ref": "#/$defs/Architecture"
>          },
>          "os": {
> -          "$ref": "#/definitions/OS"
> +          "$ref": "#/$defs/OS"
>          },
>          "cpu": {
> -          "$ref": "#/definitions/cpu"
> +          "$ref": "#/$defs/CPUType"
>          },
>          "compiler": {
> -          "$ref": "#/definitions/compiler"
> +          "$ref": "#/$defs/Compiler"
>          },
> -          "compiler_wrapper": {
> -          "type": "string",
> -          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
> +        "compiler_wrapper": {
> +          "default": "",
> +          "title": "Compiler Wrapper",
> +          "type": "string"
>          }
>        },
> -      "additionalProperties": false,
>        "required": [
>          "arch",
>          "os",
>          "cpu",
>          "compiler"
> -      ]
> +      ],
> +      "title": "BuildTargetConfiguration",
> +      "type": "object"
>      },
> -    "hugepages_2mb": {
> -      "type": "object",
> -      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
> +    "CPUType": {
> +      "description": "The supported CPUs of :class:`~framework.testbed_model.node.Node`\\s.",
> +      "enum": [
> +        "native",
> +        "armv8a",
> +        "dpaa2",
> +        "thunderx",
> +        "xgene1"
> +      ],
> +      "title": "CPUType",
> +      "type": "string"
> +    },
> +    "Compiler": {
> +      "description": "The supported compilers of :class:`~framework.testbed_model.node.Node`\\s.",
> +      "enum": [
> +        "gcc",
> +        "clang",
> +        "icc",
> +        "msvc"
> +      ],
> +      "title": "Compiler",
> +      "type": "string"
> +    },
> +    "HugepageConfiguration": {
> +      "additionalProperties": false,
> +      "description": "The hugepage configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    number_of: The number of hugepages to allocate.\n    force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.",
>        "properties": {
>          "number_of": {
> -          "type": "integer",
> -          "description": "The number of hugepages to configure. Hugepage size will be the system default."
> +          "title": "Number Of",
> +          "type": "integer"
>          },
>          "force_first_numa": {
> -          "type": "boolean",
> -          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
> +          "title": "Force First Numa",
> +          "type": "boolean"
>          }
>        },
> -      "additionalProperties": false,
>        "required": [
> -        "number_of"
> -      ]
> -    },
> -    "mac_address": {
> -      "type": "string",
> -      "description": "A MAC address",
> -      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
> +        "number_of",
> +        "force_first_numa"
> +      ],
> +      "title": "HugepageConfiguration",
> +      "type": "object"
>      },
> -    "pci_address": {
> -      "type": "string",
> -      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
> +    "OS": {
> +      "description": "The supported operating systems of :class:`~framework.testbed_model.node.Node`\\s.",
> +      "enum": [
> +        "linux",
> +        "freebsd",
> +        "windows"
> +      ],
> +      "title": "OS",
> +      "type": "string"
>      },
> -    "port_peer_address": {
> -      "description": "Peer is a TRex port, and IXIA port or a PCI address",
> -      "oneOf": [
> -        {
> -          "description": "PCI peer port",
> -          "$ref": "#/definitions/pci_address"
> +    "PortConfig": {
> +      "additionalProperties": false,
> +      "description": "The port configuration of :class:`~framework.testbed_model.node.Node`\\s.\n\nAttributes:\n    pci: The PCI address of the port.\n    os_driver_for_dpdk: The operating system driver name for use with DPDK.\n    os_driver: The operating system driver name when the operating system controls the port.\n    peer_node: The :class:`~framework.testbed_model.node.Node` of the port\n        connected to this port.\n    peer_pci: The PCI address of the port connected to this port.",
> +      "properties": {
> +        "pci": {
> +          "description": "The local PCI address of the port.",
> +          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
> +          "title": "Pci",
> +          "type": "string"
> +        },
> +        "os_driver_for_dpdk": {
> +          "description": "The driver that the kernel should bind this device to for DPDK to use it.",
> +          "examples": [
> +            "vfio-pci",
> +            "mlx5_core"
> +          ],
> +          "title": "Os Driver For Dpdk",
> +          "type": "string"
> +        },
> +        "os_driver": {
> +          "description": "The driver normally used by this port",
> +          "examples": [
> +            "i40e",
> +            "ice",
> +            "mlx5_core"
> +          ],
> +          "title": "Os Driver",
> +          "type": "string"
> +        },
> +        "peer_node": {
> +          "description": "The name of the peer node this port is connected to.",
> +          "title": "Peer Node",
> +          "type": "string"
> +        },
> +        "peer_pci": {
> +          "description": "The PCI address of the peer port this port is connected to.",
> +          "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$",
> +          "title": "Peer Pci",
> +          "type": "string"
>          }
> -      ]
> +      },
> +      "required": [
> +        "pci",
> +        "os_driver_for_dpdk",
> +        "os_driver",
> +        "peer_node",
> +        "peer_pci"
> +      ],
> +      "title": "PortConfig",
> +      "type": "object"
>      },
> -    "test_suite": {
> -      "type": "string",
> -      "enum": [
> -        "hello_world",
> -        "os_udp",
> -        "pmd_buffer_scatter"
> -      ]
> +    "ScapyTrafficGeneratorConfig": {
> +      "additionalProperties": false,
> +      "description": "Scapy traffic generator specific configuration.",
> +      "properties": {
> +        "type": {
> +          "const": "SCAPY",
> +          "enum": [
> +            "SCAPY"
> +          ],
> +          "title": "Type",
> +          "type": "string"
> +        }
> +      },
> +      "required": [
> +        "type"
> +      ],
> +      "title": "ScapyTrafficGeneratorConfig",
> +      "type": "object"
>      },
> -    "test_target": {
> -      "type": "object",
> +    "SutNodeConfiguration": {
> +      "additionalProperties": false,
> +      "description": ":class:`~framework.testbed_model.sut_node.SutNode` specific configuration.\n\nAttributes:\n    memory_channels: The number of memory channels to use when running DPDK.",
>        "properties": {
> -        "suite": {
> -          "$ref": "#/definitions/test_suite"
> +        "name": {
> +          "description": "A unique identifier for this node.",
> +          "title": "Name",
> +          "type": "string"
> +        },
> +        "hostname": {
> +          "description": "The hostname or IP address of the node.",
> +          "title": "Hostname",
> +          "type": "string"
> +        },
> +        "user": {
> +          "description": "The login user to use to connect to this node.",
> +          "title": "User",
> +          "type": "string"
>          },
> -        "cases": {
> -          "type": "array",
> -          "description": "If specified, only this subset of test suite's test cases will be run.",
> +        "password": {
> +          "anyOf": [
> +            {
> +              "type": "string"
> +            },
> +            {
> +              "type": "null"
> +            }
> +          ],
> +          "default": null,
> +          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
> +          "title": "Password"
> +        },
> +        "use_first_core": {
> +          "default": false,
> +          "description": "DPDK won't use the first physical core if set to False.",
> +          "title": "Use First Core",
> +          "type": "boolean"
> +        },
> +        "hugepages_2mb": {
> +          "anyOf": [
> +            {
> +              "$ref": "#/$defs/HugepageConfiguration"
> +            },
> +            {
> +              "type": "null"
> +            }
> +          ],
> +          "default": null
> +        },
> +        "ports": {
>            "items": {
> -            "type": "string"
> +            "$ref": "#/$defs/PortConfig"
>            },
> -          "minimum": 1
> +          "minItems": 1,
> +          "title": "Ports",
> +          "type": "array"
> +        },
> +        "memory_channels": {
> +          "default": 1,
> +          "description": "Number of memory channels to use when running DPDK.",
> +          "title": "Memory Channels",
> +          "type": "integer"
> +        },
> +        "arch": {
> +          "$ref": "#/$defs/Architecture"
> +        },
> +        "os": {
> +          "$ref": "#/$defs/OS"
> +        },
> +        "lcores": {
> +          "default": "1",
> +          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
> +          "examples": [
> +            "1,2,3,4,5,18-22",
> +            "10-15"
> +          ],
> +          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> +          "title": "Lcores",
> +          "type": "string"
>          }
>        },
>        "required": [
> -        "suite"
> +        "name",
> +        "hostname",
> +        "user",
> +        "ports",
> +        "arch",
> +        "os"
>        ],
> -      "additionalProperties": false
> -    }
> -  },
> -  "type": "object",
> -  "properties": {
> -    "nodes": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "name": {
> -            "type": "string",
> -            "description": "A unique identifier for this node"
> -          },
> -          "hostname": {
> -            "type": "string",
> -            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
> -          },
> -          "user": {
> -            "type": "string",
> -            "description": "The user to access this node with."
> -          },
> -          "password": {
> -            "type": "string",
> -            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
> -          },
> -          "arch": {
> -            "$ref": "#/definitions/ARCH"
> -          },
> -          "os": {
> -            "$ref": "#/definitions/OS"
> -          },
> -          "lcores": {
> -            "type": "string",
> -            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> -            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
> +      "title": "SutNodeConfiguration",
> +      "type": "object"
> +    },
> +    "TGNodeConfiguration": {
> +      "additionalProperties": false,
> +      "description": ":class:`~framework.testbed_model.tg_node.TGNode` specific configuration.\n\nAttributes:\n    traffic_generator: The configuration of the traffic generator present on the TG node.",
> +      "properties": {
> +        "name": {
> +          "description": "A unique identifier for this node.",
> +          "title": "Name",
> +          "type": "string"
> +        },
> +        "hostname": {
> +          "description": "The hostname or IP address of the node.",
> +          "title": "Hostname",
> +          "type": "string"
> +        },
> +        "user": {
> +          "description": "The login user to use to connect to this node.",
> +          "title": "User",
> +          "type": "string"
> +        },
> +        "password": {
> +          "anyOf": [
> +            {
> +              "type": "string"
> +            },
> +            {
> +              "type": "null"
> +            }
> +          ],
> +          "default": null,
> +          "description": "The login password to use to connect to this node. SSH keys are STRONGLY preferred, use only as last resort.",
> +          "title": "Password"
> +        },
> +        "use_first_core": {
> +          "default": false,
> +          "description": "DPDK won't use the first physical core if set to False.",
> +          "title": "Use First Core",
> +          "type": "boolean"
> +        },
> +        "hugepages_2mb": {
> +          "anyOf": [
> +            {
> +              "$ref": "#/$defs/HugepageConfiguration"
> +            },
> +            {
> +              "type": "null"
> +            }
> +          ],
> +          "default": null
> +        },
> +        "ports": {
> +          "items": {
> +            "$ref": "#/$defs/PortConfig"
>            },
> -          "use_first_core": {
> -            "type": "boolean",
> -            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
> +          "minItems": 1,
> +          "title": "Ports",
> +          "type": "array"
> +        },
> +        "arch": {
> +          "$ref": "#/$defs/Architecture"
> +        },
> +        "os": {
> +          "$ref": "#/$defs/OS"
> +        },
> +        "lcores": {
> +          "default": "1",
> +          "description": "Comma-separated list of logical cores to use. An empty string means use all lcores.",
> +          "examples": [
> +            "1,2,3,4,5,18-22",
> +            "10-15"
> +          ],
> +          "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> +          "title": "Lcores",
> +          "type": "string"
> +        },
> +        "traffic_generator": {
> +          "discriminator": {
> +            "mapping": {
> +              "SCAPY": "#/$defs/ScapyTrafficGeneratorConfig"
> +            },
> +            "propertyName": "type"
>            },
> -          "memory_channels": {
> -            "type": "integer",
> -            "description": "How many memory channels to use. Optional, defaults to 1."
> +          "oneOf": [
> +            {
> +              "$ref": "#/$defs/ScapyTrafficGeneratorConfig"
> +            }
> +          ],
> +          "title": "Traffic Generator"
> +        }
> +      },
> +      "required": [
> +        "name",
> +        "hostname",
> +        "user",
> +        "ports",
> +        "arch",
> +        "os",
> +        "traffic_generator"
> +      ],
> +      "title": "TGNodeConfiguration",
> +      "type": "object"
> +    },
> +    "TestRunConfiguration": {
> +      "additionalProperties": false,
> +      "description": "The configuration of a test run.\n\nThe configuration contains testbed information, what tests to execute\nand with what DPDK build.\n\nAttributes:\n    build_targets: A list of DPDK builds to test.\n    perf: Whether to run performance tests.\n    func: Whether to run functional tests.\n    skip_smoke_tests: Whether to skip smoke tests.\n    test_suites: The names of test suites and/or test cases to execute.\n    system_under_test_node: The SUT node configuration to use in this test run.\n    traffic_generator_node: The TG node name to use in this test run.",
> +      "properties": {
> +        "perf": {
> +          "description": "Enable performance testing.",
> +          "title": "Perf",
> +          "type": "boolean"
> +        },
> +        "func": {
> +          "description": "Enable functional testing.",
> +          "title": "Func",
> +          "type": "boolean"
> +        },
> +        "test_suites": {
> +          "items": {
> +            "$ref": "#/$defs/TestSuiteConfig"
>            },
> -          "hugepages_2mb": {
> -            "$ref": "#/definitions/hugepages_2mb"
> +          "minItems": 1,
> +          "title": "Test Suites",
> +          "type": "array"
> +        },
> +        "build_targets": {
> +          "items": {
> +            "$ref": "#/$defs/BuildTargetConfiguration"
>            },
> -          "ports": {
> -            "type": "array",
> -            "items": {
> -              "type": "object",
> -              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
> -              "properties": {
> -                "pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The local PCI address of the port"
> -                },
> -                "os_driver_for_dpdk": {
> -                  "type": "string",
> -                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
> -                },
> -                "os_driver": {
> -                  "type": "string",
> -                  "description": "The driver normally used by this port (ex: i40e)"
> -                },
> -                "peer_node": {
> -                  "type": "string",
> -                  "description": "The name of the node the peer port is on"
> -                },
> -                "peer_pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The PCI address of the peer port"
> -                }
> -              },
> -              "additionalProperties": false,
> -              "required": [
> -                "pci",
> -                "os_driver_for_dpdk",
> -                "os_driver",
> -                "peer_node",
> -                "peer_pci"
> -              ]
> -            },
> -            "minimum": 1
> +          "title": "Build Targets",
> +          "type": "array"
> +        },
> +        "skip_smoke_tests": {
> +          "default": false,
> +          "title": "Skip Smoke Tests",
> +          "type": "boolean"
> +        },
> +        "system_under_test_node": {
> +          "$ref": "#/$defs/TestRunSUTNodeConfiguration"
> +        },
> +        "traffic_generator_node": {
> +          "title": "Traffic Generator Node",
> +          "type": "string"
> +        }
> +      },
> +      "required": [
> +        "perf",
> +        "func",
> +        "test_suites",
> +        "build_targets",
> +        "system_under_test_node",
> +        "traffic_generator_node"
> +      ],
> +      "title": "TestRunConfiguration",
> +      "type": "object"
> +    },
> +    "TestRunSUTNodeConfiguration": {
> +      "additionalProperties": false,
> +      "description": "The SUT node configuration of a test run.\n\nAttributes:\n    node_name: The SUT node to use in this test run.\n    vdevs: The names of virtual devices to test.",
> +      "properties": {
> +        "vdevs": {
> +          "items": {
> +            "type": "string"
>            },
> -          "traffic_generator": {
> -            "oneOf": [
> -              {
> -                "type": "object",
> -                "description": "Scapy traffic generator. Used for functional testing.",
> -                "properties": {
> -                  "type": {
> -                    "type": "string",
> -                    "enum": [
> -                      "SCAPY"
> -                    ]
> -                  }
> -                }
> -              }
> -            ]
> -          }
> +          "title": "Vdevs",
> +          "type": "array"
>          },
> -        "additionalProperties": false,
> -        "required": [
> -          "name",
> -          "hostname",
> -          "user",
> -          "arch",
> -          "os"
> -        ]
> +        "node_name": {
> +          "title": "Node Name",
> +          "type": "string"
> +        }
>        },
> -      "minimum": 1
> +      "required": [
> +        "node_name"
> +      ],
> +      "title": "TestRunSUTNodeConfiguration",
> +      "type": "object"
>      },
> -    "test_runs": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "build_targets": {
> -            "type": "array",
> -            "items": {
> -              "$ref": "#/definitions/build_target"
> +    "TestSuiteConfig": {
> +      "anyOf": [
> +        {
> +          "additionalProperties": false,
> +          "properties": {
> +            "test_suite": {
> +              "description": "The identifying name of the test suite.",
> +              "title": "Test suite name",
> +              "type": "string"
>              },
> -            "minimum": 1
> -          },
> -          "perf": {
> -            "type": "boolean",
> -            "description": "Enable performance testing."
> -          },
> -          "func": {
> -            "type": "boolean",
> -            "description": "Enable functional testing."
> -          },
> -          "test_suites": {
> -            "type": "array",
> -            "items": {
> -              "oneOf": [
> -                {
> -                  "$ref": "#/definitions/test_suite"
> -                },
> -                {
> -                  "$ref": "#/definitions/test_target"
> -                }
> -              ]
> +            "test_cases": {
> +              "description": "The identifying name of the test cases of the test suite.",
> +              "items": {
> +                "type": "string"
> +              },
> +              "title": "Test cases by name",
> +              "type": "array"
>              }
>            },
> -          "skip_smoke_tests": {
> -            "description": "Optional field that allows you to skip smoke testing",
> -            "type": "boolean"
> -          },
> -          "system_under_test_node": {
> -            "type":"object",
> -            "properties": {
> -              "node_name": {
> -                "$ref": "#/definitions/node_name"
> -              },
> -              "vdevs": {
> -                "description": "Optional list of names of vdevs to be used in the test run",
> -                "type": "array",
> -                "items": {
> -                  "type": "string"
> -                }
> -              }
> -            },
> -            "required": [
> -              "node_name"
> -            ]
> +          "required": [
> +            "test_suite"
> +          ],
> +          "type": "object"
> +        },
> +        {
> +          "type": "string"
> +        }
> +      ],
> +      "description": "Test suite configuration.\n\nInformation about a single test suite to be executed. It can be represented and validated as a\nstring type in the form of: ``TEST_SUITE [TEST_CASE, ...]``, in the configuration file.\n\nAttributes:\n    test_suite: The name of the test suite module without the starting ``TestSuite_``.\n    test_cases: The names of test cases from this test suite to execute.\n        If empty, all test cases will be executed.",
> +      "title": "TestSuiteConfig"
> +    }
> +  },
> +  "description": "DTS testbed and test configuration.\n\nAttributes:\n    test_runs: Test run configurations.\n    nodes: Node configurations.",
> +  "properties": {
> +    "test_runs": {
> +      "items": {
> +        "$ref": "#/$defs/TestRunConfiguration"
> +      },
> +      "minItems": 1,
> +      "title": "Test Runs",
> +      "type": "array"
> +    },
> +    "nodes": {
> +      "items": {
> +        "anyOf": [
> +          {
> +            "$ref": "#/$defs/TGNodeConfiguration"
>            },
> -          "traffic_generator_node": {
> -            "$ref": "#/definitions/node_name"
> +          {
> +            "$ref": "#/$defs/SutNodeConfiguration"
>            }
> -        },
> -        "additionalProperties": false,
> -        "required": [
> -          "build_targets",
> -          "perf",
> -          "func",
> -          "test_suites",
> -          "system_under_test_node",
> -          "traffic_generator_node"
>          ]
>        },
> -      "minimum": 1
> +      "minItems": 1,
> +      "title": "Nodes",
> +      "type": "array"
>      }
>    },
>    "required": [
>      "test_runs",
>      "nodes"
>    ],
> -  "additionalProperties": false
> -}
> +  "title": "Configuration",
> +  "type": "object",
> +  "$schema": "https://json-schema.org/draft/2020-12/schema"
> +}
> \ No newline at end of file
> diff --git a/dts/generate-schema.py b/dts/generate-schema.py
> new file mode 100755
> index 0000000000..b41d28492f
> --- /dev/null
> +++ b/dts/generate-schema.py
> @@ -0,0 +1,38 @@
> +#!/usr/bin/env python3
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2024 Arm Limited
> +
> +"""JSON schema generation script."""
> +
> +import json
> +import os
> +
> +from pydantic.json_schema import GenerateJsonSchema
> +
> +from framework.config import ConfigurationType
> +
> +DTS_DIR = os.path.dirname(os.path.realpath(__file__))
> +RELATIVE_PATH_TO_SCHEMA = "framework/config/conf_yaml_schema.json"
> +
> +
> +class GenerateSchemaWithDialect(GenerateJsonSchema):
> +    """Custom schema generator which adds the schema dialect."""
> +
> +    def generate(self, schema, mode="validation"):
> +        """Generate JSON schema."""
> +        json_schema = super().generate(schema, mode=mode)
> +        json_schema["$schema"] = self.schema_dialect
> +        return json_schema
> +
> +
> +try:
> +    path = os.path.join(DTS_DIR, RELATIVE_PATH_TO_SCHEMA)
> +
> +    with open(path, "w") as schema_file:
> +        schema_dict = ConfigurationType.json_schema(schema_generator=GenerateSchemaWithDialect)
> +        schema_json = json.dumps(schema_dict, indent=2)
> +        schema_file.write(schema_json)
> +
> +    print("Schema generated successfully!")
> +except Exception as e:
> +    raise Exception("failed to generate schema") from e
> --
> 2.34.1
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 0/5] dts: Pydantic configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (4 preceding siblings ...)
  2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
@ 2024-10-25 15:58 ` Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 1/5] dts: add pydantic dependency Luca Vizzarro
                     ` (4 more replies)
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (3 subsequent siblings)
  9 siblings, 5 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Hi there,

sending a v2 for the pydantic changes.

v2:
- rebased and resolved merge conflicts:
  - the capabilities patch introducing TestCase has now been combined
    with TestSuiteSpec
  - the external build patch added more configuration complexity, which
    has been reworked in Pydantic by adding exclusion via structured
    models
- split the pydantic/warlock dependency chains
- deleted the config schema, as it is no longer needed
- removed the config schema generator
- turned all configuration dataclasses into Pydantic BaseModels (see
  the sketch below)
- refactored the code
- improved docstrings
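
As a rough illustration of the dataclass-to-BaseModel conversion (the
sketch mentioned in the list above), a converted model could look like
the following. The field set of PortConfig mirrors the generated JSON
schema, but the exact base classes, validators and field metadata used
in the series may differ:

    from pydantic import BaseModel, Field

    PCI_REGEX = r"^[\da-fA-F]{4}:[\da-fA-F]{2}:[\da-fA-F]{2}.\d:?\w*$"

    class PortConfig(BaseModel, frozen=True, extra="forbid"):
        """Illustrative sketch of a port configuration model."""

        pci: str = Field(
            description="The local PCI address of the port.",
            pattern=PCI_REGEX,
        )
        os_driver_for_dpdk: str = Field(
            description="The driver to bind this device to for DPDK.",
            examples=["vfio-pci", "mlx5_core"],
        )
        os_driver: str = Field(
            description="The driver normally used by this port.",
            examples=["i40e", "ice", "mlx5_core"],
        )
        peer_node: str = Field(
            description="The name of the peer node this port is "
            "connected to."
        )
        peer_pci: str = Field(
            description="The PCI address of the peer port.",
            pattern=PCI_REGEX,
        )

Validation then happens at construction time, and extra="forbid"
matches the "additionalProperties": false behaviour of the old
hand-written schema.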

Best,
Luca

---
Depends-on: series-33590 ("DTS external DPDK build")

Luca Vizzarro (5):
  dts: add pydantic dependency
  dts: add TestSuiteSpec class and discovery
  dts: use pydantic in the configuration
  dts: remove warlock dependency
  dts: use TestSuiteSpec class imports

 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 842 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 458 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       | 139 +--
 dts/framework/settings.py                     | 124 +--
 dts/framework/test_suite.py                   | 189 +++-
 dts/framework/testbed_model/capability.py     |  12 +-
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/poetry.lock                               | 370 ++++----
 dts/pyproject.toml                            |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 23 files changed, 1010 insertions(+), 1537 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 1/5] dts: add pydantic dependency
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-10-25 15:58   ` Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                     ` (3 subsequent siblings)
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

As part of the configuration validation and deserialization
improvements, add pydantic as a project dependency. Pydantic is a
library that caters to both of these needs, while simplifying the
process and the resulting code.
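
As a minimal sketch of the validation style this enables (not part of
this patch; the model below is a simplified stand-in for the real
HugepageConfiguration), assuming PyYAML, which DTS already uses to
load its configuration:

    import yaml
    from pydantic import BaseModel, ValidationError

    class HugepageConfiguration(BaseModel):
        number_of: int
        force_first_numa: bool

    raw = yaml.safe_load("number_of: '256'\nforce_first_numa: no\n")
    try:
        # Lax mode coerces the string "256" to an int and validates
        # both fields against their annotated types.
        config = HugepageConfiguration(**raw)
    except ValidationError as e:
        # All field errors are aggregated into one readable report.
        print(e)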

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   1 +
 2 files changed, 170 insertions(+), 2 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index cf5f6569c6..56c50ad52c 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 
 [[package]]
 name = "aenum"
@@ -23,6 +23,17 @@ files = [
     {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
 ]
 
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -567,6 +578,16 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -762,6 +783,130 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.9.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
+    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.6.0"
+pydantic-core = "2.23.4"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+timezone = ["tzdata"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.23.4"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
+    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
+    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
+    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
+    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
+    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
+    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
+    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
+    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
+    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
+    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
+    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
+    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
+    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -880,6 +1025,7 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -887,8 +1033,16 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -905,6 +1059,7 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -912,6 +1067,7 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -1327,6 +1483,17 @@ files = [
     {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
 ]
 
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
+]
+
 [[package]]
 name = "urllib3"
 version = "2.0.7"
@@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
+content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 506380ac2f..6c2d1ca8a4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -28,6 +28,7 @@ scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
 aenum = "^3.1.15"
+pydantic = "^2.9.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 2/5] dts: add TestSuiteSpec class and discovery
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 1/5] dts: add pydantic dependency Luca Vizzarro
@ 2024-10-25 15:58   ` Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 3/5] dts: use pydantic in the configuration Luca Vizzarro
                     ` (2 subsequent siblings)
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Currently there is no definition that identifies all the test
suites available to test. This change simplifies the process of
discovering and identifying all the test suites.
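
As a rough usage sketch (not part of the patch), assuming the DTS
framework and the tests package are importable; the suite name
"hello_world" is a hypothetical example:

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # List every discovered test suite module and its test suite class.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.module_name, spec.class_name)

    # Look up a single suite by its short name and inspect its test cases.
    spec = find_by_name("hello_world")
    if spec is not None:
        for test_case in spec.class_obj.get_test_cases():
            print(test_case.name, test_case.test_type)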

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py                   |   2 +-
 dts/framework/test_suite.py               | 189 +++++++++++++++++++---
 dts/framework/testbed_model/capability.py |  12 +-
 3 files changed, 177 insertions(+), 26 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 8bbe698eaf..195622c653 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
         for test_suite_config in test_suite_configs:
             test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
             test_cases: list[type[TestCase]] = []
-            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
+            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases
             )
             if func:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index cbe3b30ffc..936eb2cede 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -16,13 +17,20 @@
 import inspect
 from collections import Counter
 from collections.abc import Callable, Sequence
+from dataclasses import dataclass
 from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
+from pkgutil import iter_modules
+from types import ModuleType
 from typing import ClassVar, Protocol, TypeVar, Union, cast
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.capability import TestProtocol
 from framework.testbed_model.port import Port
@@ -33,7 +41,7 @@
     PacketFilteringConfig,
 )
 
-from .exception import ConfigurationError, TestCaseVerifyError
+from .exception import ConfigurationError, InternalError, TestCaseVerifyError
 from .logger import DTSLogger, get_dts_logger
 from .utils import get_packet_summaries
 
@@ -112,10 +120,24 @@ def __init__(
         self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
 
     @classmethod
-    def get_test_cases(
+    def get_test_cases(cls) -> list[type["TestCase"]]:
+        """A list of all the available test cases."""
+
+        def is_test_case(function: Callable) -> bool:
+            if inspect.isfunction(function):
+                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
+                # But function.test_type exists.
+                if hasattr(function, "test_type"):
+                    return isinstance(function.test_type, TestCaseType)
+            return False
+
+        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
+
+    @classmethod
+    def filter_test_cases(
         cls, test_case_sublist: Sequence[str] | None = None
     ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
-        """Filter `test_case_subset` from this class.
+        """Filter `test_case_sublist` from this class.
 
         Test cases are regular (or bound) methods decorated with :func:`func_test`
         or :func:`perf_test`.
@@ -129,17 +151,8 @@ def get_test_cases(
             as methods are bound to instances and this method only has access to the class.
 
         Raises:
-            ConfigurationError: If a test case from `test_case_subset` is not found.
+            ConfigurationError: If a test case from `test_case_sublist` is not found.
         """
-
-        def is_test_case(function: Callable) -> bool:
-            if inspect.isfunction(function):
-                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
-                # But function.test_type exists.
-                if hasattr(function, "test_type"):
-                    return isinstance(function.test_type, TestCaseType)
-            return False
-
         if test_case_sublist is None:
             test_case_sublist = []
 
@@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool:
         func_test_cases = set()
         perf_test_cases = set()
 
-        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
-            if test_case_name in test_case_sublist_copy:
+        for test_case in cls.get_test_cases():
+            if test_case.name in test_case_sublist_copy:
                 # if test_case_sublist_copy is non-empty, remove the found test case
                 # so that we can look at the remainder at the end
-                test_case_sublist_copy.remove(test_case_name)
+                test_case_sublist_copy.remove(test_case.name)
             elif test_case_sublist:
                 # the original list not being empty means we're filtering test cases
-                # since we didn't remove test_case_name in the previous branch,
+                # since we didn't remove test_case.name in the previous branch,
                 # it doesn't match the filter and we don't want to remove it
                 continue
 
-            match test_case_function.test_type:
+            match test_case.test_type:
                 case TestCaseType.PERFORMANCE:
-                    perf_test_cases.add(test_case_function)
+                    perf_test_cases.add(test_case)
                 case TestCaseType.FUNCTIONAL:
-                    func_test_cases.add(test_case_function)
+                    func_test_cases.add(test_case)
 
         if test_case_sublist_copy:
             raise ConfigurationError(
@@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
     test case function to :class:`TestCase` and sets common variables.
     """
 
+    #:
+    name: ClassVar[str]
     #:
     test_type: ClassVar[TestCaseType]
     #: necessary for mypy so that it can treat this class as the function it's shadowing
@@ -560,6 +575,7 @@ def make_decorator(
 
         def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
             test_case = cast(type[TestCase], func)
+            test_case.name = func.__name__
             test_case.skip = cls.skip
             test_case.skip_reason = cls.skip_reason
             test_case.required_capabilities = set()
@@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
 func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
 #: The decorator for performance test cases.
 perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_obj(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            `obj` must be a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
+            if class_name == self.class_name:
+                return class_obj
+
+        raise InternalError(
+            f"Expected class {self.class_name} not found in module {self.module_name}."
+        )
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
+        PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites. If :data:`None`,
+                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
+            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
+                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_obj:
+                    test_suites.append(test_suite)
+            except InternalError as err:
+                get_dts_logger().warning(err)
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
index 2207957a7a..0d5f0e0b32 100644
--- a/dts/framework/testbed_model/capability.py
+++ b/dts/framework/testbed_model/capability.py
@@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
 
 import inspect
 from abc import ABC, abstractmethod
-from collections.abc import MutableSet, Sequence
+from collections.abc import MutableSet
 from dataclasses import dataclass
-from typing import Callable, ClassVar, Protocol
+from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
 
 from typing_extensions import Self
 
@@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
 from .sut_node import SutNode
 from .topology import Topology, TopologyType
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestCase
+
 
 class Capability(ABC):
     """The base class for various capabilities.
@@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
         if inspect.isclass(test_case_or_suite):
             if self.topology_type is not TopologyType.default:
                 self.add_to_required(test_case_or_suite)
-                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
-                for test_case in func_test_cases | perf_test_cases:
+                for test_case in test_case_or_suite.get_test_cases():
                     if test_case.topology_type.topology_type is TopologyType.default:
                         # test case topology has not been set, use the one set by the test suite
                         self.add_to_required(test_case)
@@ -446,7 +448,7 @@ class TestProtocol(Protocol):
     required_capabilities: ClassVar[set[Capability]] = set()
 
     @classmethod
-    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
+    def get_test_cases(cls) -> list[type["TestCase"]]:
         """Get test cases. Should be implemented by subclasses containing test cases.
 
         Raises:
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 3/5] dts: use pydantic in the configuration
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 1/5] dts: add pydantic dependency Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-10-25 15:58   ` Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 4/5] dts: remove warlock dependency Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross-referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to the native str and Enum. This change was necessary to
  enable the discriminator for object unions (a minimal sketch of
  this pattern is shown after this list)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying the validation and handling thanks to
  pattern matching
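
For illustration only, here is a minimal self-contained sketch of the
discriminated-union pattern described above. The DUMMY generator type
and the condensed model bodies are illustrative stand-ins, not the
exact code of this patch:

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import BaseModel, Field

    class TrafficGeneratorType(str, Enum):
        SCAPY = "SCAPY"
        DUMMY = "DUMMY"  # hypothetical, only to make the union non-trivial

    class ScapyTrafficGeneratorConfig(BaseModel):
        type: Literal[TrafficGeneratorType.SCAPY]

    class DummyTrafficGeneratorConfig(BaseModel):
        type: Literal[TrafficGeneratorType.DUMMY]

    # Pydantic picks the model to validate against based on `type`.
    TrafficGeneratorConfigTypes = Annotated[
        ScapyTrafficGeneratorConfig | DummyTrafficGeneratorConfig,
        Field(discriminator="type"),
    ]

    class TGNodeConfiguration(BaseModel):
        traffic_generator: TrafficGeneratorConfigTypes

    node = TGNodeConfiguration.model_validate(
        {"traffic_generator": {"type": "SCAPY"}}
    )
    assert isinstance(node.traffic_generator, ScapyTrafficGeneratorConfig)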

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 842 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 458 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 671 insertions(+), 1221 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@ config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d0d95d00c7..8de036b342 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,33 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
+from pydantic.config import JsonDict
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
+
+
+class FrozenModel(BaseModel, frozen=True, extra="forbid"):
+    """Pydantic base model with frozen and forbidden extra attributes."""
 
 
 @unique
@@ -118,15 +113,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(FrozenModel):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +132,10 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(FrozenModel):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +144,57 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(FrozenModel):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, which each child class is required to define so it can
+            be distinguished among the others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(FrozenModel):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,69 +213,24 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
 class SutNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
@@ -304,10 +238,11 @@ class SutNodeConfiguration(NodeConfiguration):
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
 class TGNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
@@ -315,11 +250,14 @@ class TGNodeConfiguration(NodeConfiguration):
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
+
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class NodeInfo:
+class NodeInfo(FrozenModel):
     """Supplemental node information.
 
     Attributes:
@@ -336,165 +274,187 @@ class NodeInfo:
     kernel_version: str
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: str) -> Path:
+    """Resolve a path as string into an absolute path."""
+    return Path(path).resolve()
+
+
+class BaseDPDKLocation(FrozenModel):
+    """DPDK location.
 
-    The configuration used for building DPDK.
+    The path to the DPDK sources and the type of location (local or remote).
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    This class is distinguished from :class:`RemoteDPDKTreeLocation` by enforcing on-the-fly
+    validation.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation):
+    """Local DPDK tarball location.
+
+    This class is distinguished from :class:`RemoteDPDKTarballLocation` by enforcing on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: Path
+
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
+
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
 
+class RemoteDPDKLocation(BaseDPDKLocation):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+class RemoteDPDKTreeLocation(RemoteDPDKLocation):
+    """Remote DPDK tree location.
+
+    This class is distinct from :class:`LocalDPDKTreeLocation`, which enforces on-the-fly validation.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
+
+
+class RemoteDPDKTarballLocation(RemoteDPDKLocation):
+    """Remote DPDK tarball location.
+
+    This class is distinct from :class:`LocalDPDKTarballLocation`, which enforces on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
+
+
+class BaseDPDKBuildConfiguration(FrozenModel):
+    """The base configuration for different types of build.
+
+    The configuration contains the DPDK location and the configuration used for building it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration):
+    """DPDK precompiled build configuration.
+
+    Attributes:
+        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
+            is located in a subdirectory of the `dpdk_tree` or `tarball` root directory. Otherwise,
+            `build_options` from the configuration is used to build DPDK from source.
+    """
+
+    precompiled_build_dir: str = Field(min_length=1)
+
+
+class DPDKBuildOptionsConfiguration(FrozenModel):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target OS to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
 
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration):
+    """DPDK uncompiled build configuration.
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildInfo:
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
+
+
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
+
+
+class DPDKBuildInfo(FrozenModel):
     """Various versions and other information about a DPDK build.
 
     Attributes:
@@ -506,44 +466,106 @@ class DPDKBuildInfo:
     compiler_version: str | None
 
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+def make_parsable_schema(schema: JsonDict):
+    """Updates a model's JSON schema to make a string representation a valid alternative.
+
+    This utility function must be used with models that can be represented and validated as a
+    string instead of an object mapping. Normally, the generated JSON schema shows only the
+    object mapping. This function wraps the mapping in an anyOf property together with a
+    string type.
+
+    This function is a valid `Callable` for the
+    :attr:`~pydantic.config.ConfigDict.json_schema_extra` attribute.
+    """
+    inner_schema = schema.copy()
+    del inner_schema["title"]
+
+    title = schema.get("title")
+    description = schema.get("description")
+
+    schema.clear()
+
+    schema["title"] = title
+    schema["description"] = description
+    schema["anyOf"] = [inner_schema, {"type": "string"}]
+
+
+class TestSuiteConfig(FrozenModel, json_schema_extra=make_parsable_schema):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. In the configuration file, it can also
+    be represented and validated as a string in the form ``TEST_SUITE [TEST_CASE, ...]``.
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        available_test_cases = map(
+            lambda t: t.name, self.test_suite_spec.class_obj.get_test_cases()
+        )
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
+
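Both representations below validate to the same model; a minimal sketch (the suite and case names are illustrative and must resolve to modules and test cases known to the framework):

    TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    TestSuiteConfig.model_validate(
        {"test_suite": "hello_world", "test_cases": ["hello_world_single_core"]}
    )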
+class TestRunSUTNodeConfiguration(FrozenModel):
+    """The SUT node configuration of a test run.
+
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
 
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+class TestRunConfiguration(FrozenModel):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -555,144 +577,130 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
-
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
-
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
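A sketch of a minimal test run entry as it would appear after YAML parsing (the exact shape of the ``dpdk_location`` mapping is assumed from the new location models; node and suite names are illustrative):

    TestRunConfiguration.model_validate(
        {
            "dpdk_build": {
                "dpdk_location": {"tarball": "/tmp/dpdk.tar.xz", "remote": True},
                "build_options": {
                    "arch": "x86_64", "os": "linux", "cpu": "native", "compiler": "gcc"
                },
            },
            "perf": False,
            "func": True,
            "test_suites": ["hello_world"],
            "system_under_test_node": {"node_name": "SUT 1"},
            "traffic_generator_node": "TG 1",
        }
    )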
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
 
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(BaseModel, extra="forbid"):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
 
-        return cls(test_runs=test_runs)
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
+
+        return self
+
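The bookkeeping behind this check, sketched with two correctly cross-linked single-port nodes (node names and PCI addresses illustrative):

    # Before the loop: {("sut", "0000:00:08.0"): False, ("tg", "0000:00:08.1"): False}
    # After linking:   {("sut", "0000:00:08.0"): (1, 0), ("tg", "0000:00:08.1"): (0, 0)}
    # A second port naming an already-linked peer finds a tuple instead of False and fails.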
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -702,14 +710,14 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("Failed to load the supplied configuration.") from e
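Typical use of the new entry point, as a sketch (the file name is illustrative; validation failures surface as ``ConfigurationError``):

    from pathlib import Path

    config = load_config(Path("conf.yaml"))
    for test_run, sut_node, tg_node in config.test_runs_with_nodes:
        print(test_run.test_suites, sut_node.name, tg_node.name)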
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index 3e37555fc2..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,458 +0,0 @@
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@ def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@ def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=dir
+                )
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a452319b90..1253ed86ac 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@ class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    any valid :class:`DPDKLocation` with the provided parameters if validation is successful.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
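For reference, a sketch of how the four location models map onto the argument combinations (paths are illustrative; the local variants also check that the path exists and has the right format):

    # --dpdk-tree /path/to/dpdk                -> LocalDPDKTreeLocation
    # --dpdk-tree /path/on/sut --remote-source -> RemoteDPDKTreeLocation
    # --tarball dpdk.tar.xz                    -> LocalDPDKTarballLocation
    # --tarball dpdk.tar.xz --remote-source    -> RemoteDPDKTarballLocation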
 
 
 def _process_test_suites(
@@ -512,11 +519,24 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
@@ -536,7 +556,7 @@ def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 6194ddb989..23baf1df89 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -345,7 +345,7 @@ def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -356,7 +356,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@ class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5ab7c18fb7..7a6a1b6f84 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
@@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo:
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return NodeInfo(
+            os_name=os_release_info[0].strip(),
+            os_version=os_release_info[1].strip(),
+            kernel_version=kernel_version,
+        )
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index e160386324..f3d1eac68e 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -14,13 +14,19 @@
 
 import os
 import time
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
     DPDKBuildInfo,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
     NodeInfo,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -166,7 +172,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -176,12 +184,12 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -190,7 +198,8 @@ def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -199,21 +208,26 @@ def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
@@ -226,37 +240,29 @@ def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
         Verify DPDK source tree existence on the SUT node, if exists sets the
         `_remote_dpdk_tree_path` property, otherwise sets nothing.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -276,25 +282,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -312,30 +338,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -348,33 +353,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check existence on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
-        Otherwise, sets nothing.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: DPDK has been pre-built and the build directory is located
                 in a subdirectory of `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..e862e3ac66 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 4/5] dts: remove warlock dependency
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
                     ` (2 preceding siblings ...)
  2024-10-25 15:58   ` [PATCH v2 3/5] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-10-25 15:58   ` Luca Vizzarro
  2024-10-25 15:58   ` [PATCH v2 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Since pydantic has completely replaced warlock, there is no longer any
need to keep warlock as a dependency. This change removes it.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 227 +--------------------------------------------
 dts/pyproject.toml |   1 -
 2 files changed, 1 insertion(+), 227 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 56c50ad52c..9f7db60793 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,24 +34,6 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
-optional = false
-python-versions = ">=3.7"
-files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
-]
-
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
 [package.extras]
 i18n = ["Babel (>=2.7)"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -1073,21 +995,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
 [[package]]
 name = "requests"
 version = "2.31.0"
@@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
 socks = ["PySocks (>=1.5.6,!=1.5.7)"]
 use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -1472,17 +1273,6 @@ files = [
     {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
 ]
 
-[[package]]
-name = "typing-extensions"
-version = "4.11.0"
-description = "Backported and Experimental Type Hints for Python 3.8+"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
-]
-
 [[package]]
 name = "typing-extensions"
 version = "4.12.2"
@@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
 socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
 zstd = ["zstandard (>=0.18.0)"]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
+content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6c2d1ca8a4..9a3fb02ee9 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v2 5/5] dts: use TestSuiteSpec class imports
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
                     ` (3 preceding siblings ...)
  2024-10-25 15:58   ` [PATCH v2 4/5] dts: remove warlock dependency Luca Vizzarro
@ 2024-10-25 15:58   ` Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 15:58 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are automatically imported during discovery. Since the runner then
imported the test suite modules again when loading them, every suite was
imported twice. This changes the behaviour of the runner to reuse the
classes already imported through TestSuiteSpec instead of importing them
a second time.
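
For illustration, a minimal self-contained sketch of the idea (assumed,
simplified names; not the real DTS classes, which are shown in the diff
below):

    from dataclasses import dataclass

    @dataclass
    class TestSuiteSpec:
        name: str
        class_obj: type  # recorded when the suite module is first imported

    class MyTestSuite:
        pass

    # Discovery imports the module once and records the class...
    spec = TestSuiteSpec(name="my_test_suite", class_obj=MyTestSuite)

    # ...so the runner can use it directly, with no second import:
    test_suite_class = spec.class_obj
    assert test_suite_class is MyTestSuite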

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py | 84 ++++-------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases
 
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
     def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v3 0/5] dts: Pydantic configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (5 preceding siblings ...)
  2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-10-25 16:43 ` Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 1/5] dts: add pydantic dependency Luca Vizzarro
                     ` (4 more replies)
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                   ` (2 subsequent siblings)
  9 siblings, 5 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Hi there,

sending a v3 for the pydantic changes.

v3:
- removed the common FrozenModel and configured each BaseModel
  individually, due to mypy complaints
v2:
- rebased and merge conflicts resolved:
  - capabilities patch introducing TestCase has now been combined with
    TestSuiteSpec
  - external build patch added more configuration complexity, which has
    been re-worked in pydantic by enforcing mutual exclusion via
    structured models
- split pydantic/warlock dependency chains
- deleted the config schema as no longer needed
- removed config schema generator
- turned all configuration dataclasses into Pydantic BaseModels (see the
  sketch after this list)
- refactored
- improved docstrings
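
For illustration, a minimal sketch of the modelling approach (simplified,
assumed fields; the class names mirror this series but are not the exact
definitions):

    from typing import Literal, Union

    from pydantic import BaseModel, ConfigDict, Field

    class LocalDPDKTreeLocation(BaseModel):
        model_config = ConfigDict(frozen=True)
        location: Literal["local_tree"] = "local_tree"
        dpdk_tree: str

    class RemoteDPDKTarballLocation(BaseModel):
        model_config = ConfigDict(frozen=True)
        location: Literal["remote_tarball"] = "remote_tarball"
        tarball: str

    class DPDKBuildConfiguration(BaseModel):
        model_config = ConfigDict(frozen=True)
        dpdk_location: Union[LocalDPDKTreeLocation, RemoteDPDKTarballLocation] = Field(
            discriminator="location"
        )

    # Mutually exclusive variants are enforced at validation time,
    # so invalid combinations never reach the runner:
    conf = DPDKBuildConfiguration.model_validate(
        {"dpdk_location": {"location": "local_tree", "dpdk_tree": "/opt/dpdk"}}
    )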

Best,
Luca

---
Depends-on: series-33590 ("DTS external DPDK build")

Luca Vizzarro (5):
  dts: add pydantic dependency
  dts: add TestSuiteSpec class and discovery
  dts: use pydantic in the configuration
  dts: remove warlock dependency
  dts: use TestSuiteSpec class imports

 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 844 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 458 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       | 139 +--
 dts/framework/settings.py                     | 124 +--
 dts/framework/test_suite.py                   | 189 +++-
 dts/framework/testbed_model/capability.py     |  12 +-
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/poetry.lock                               | 370 ++++----
 dts/pyproject.toml                            |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 23 files changed, 1010 insertions(+), 1539 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v3 1/5] dts: add pydantic dependency
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-10-25 16:43   ` Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                     ` (3 subsequent siblings)
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

As part of the configuration validation and deserialization
improvements, this adds pydantic as a project dependency. Pydantic
covers both needs (validation and deserialization) while simplifying
the related process and code.
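
For illustration, a minimal sketch of what pydantic provides out of the
box (a hypothetical, simplified model; not the actual DTS configuration):

    from pydantic import BaseModel, ValidationError

    class NodeConfiguration(BaseModel):
        name: str
        hostname: str

    # Deserialization and validation happen in a single step:
    node = NodeConfiguration.model_validate(
        {"name": "sut1", "hostname": "sut1.example.com"}
    )

    try:
        NodeConfiguration.model_validate({"name": "sut1"})
    except ValidationError as error:
        print(error)  # reports exactly which field is missing or invalid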

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   1 +
 2 files changed, 170 insertions(+), 2 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index cf5f6569c6..56c50ad52c 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 
 [[package]]
 name = "aenum"
@@ -23,6 +23,17 @@ files = [
     {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
 ]
 
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -567,6 +578,16 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -762,6 +783,130 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.9.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
+    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.6.0"
+pydantic-core = "2.23.4"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+timezone = ["tzdata"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.23.4"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
+    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
+    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
+    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
+    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
+    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
+    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
+    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
+    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
+    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
+    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
+    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
+    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
+    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -880,6 +1025,7 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -887,8 +1033,16 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -905,6 +1059,7 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -912,6 +1067,7 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -1327,6 +1483,17 @@ files = [
     {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
 ]
 
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
+]
+
 [[package]]
 name = "urllib3"
 version = "2.0.7"
@@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
+content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 506380ac2f..6c2d1ca8a4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -28,6 +28,7 @@ scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
 aenum = "^3.1.15"
+pydantic = "^2.9.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread
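
For reference, Poetry's caret constraint "^2.9.2" used above for the
new pydantic dependency is equivalent to ">=2.9.2,<3.0.0". A minimal
sketch checking resolved versions against it (this assumes the
third-party packaging library, which Poetry itself builds on, is
installed):

    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    # Caret: compatible releases up to, but excluding, the next major.
    caret = SpecifierSet(">=2.9.2,<3.0.0")
    assert Version("2.9.2") in caret
    assert Version("2.23.4") in caret    # any later 2.x release matches
    assert Version("3.0.0") not in caret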

* [PATCH v3 2/5] dts: add TestSuiteSpec class and discovery
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 1/5] dts: add pydantic dependency Luca Vizzarro
@ 2024-10-25 16:43   ` Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 3/5] dts: use pydantic in the configuration Luca Vizzarro
                     ` (2 subsequent siblings)
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Currently there is no definition that identifies all the test suites
available to test. This change simplifies the process of discovering
and identifying all the test suites.
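
For illustration, here is a minimal usage sketch of the discovery API
added by this patch (the suite name "hello_world" is only a
hypothetical example):

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # Each discovered spec maps a module to its test suite class,
    # e.g. TestSuite_hello_world -> TestHelloWorld.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.module_name, "->", spec.class_name)

    # Look up a single suite by its short name.
    spec = find_by_name("hello_world")
    if spec is not None:
        suite_class = spec.class_obj  # imported lazily and cached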

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py                   |   2 +-
 dts/framework/test_suite.py               | 189 +++++++++++++++++++---
 dts/framework/testbed_model/capability.py |  12 +-
 3 files changed, 177 insertions(+), 26 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 8bbe698eaf..195622c653 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
         for test_suite_config in test_suite_configs:
             test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
             test_cases: list[type[TestCase]] = []
-            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
+            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases
             )
             if func:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index cbe3b30ffc..936eb2cede 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -16,13 +17,20 @@
 import inspect
 from collections import Counter
 from collections.abc import Callable, Sequence
+from dataclasses import dataclass
 from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
+from pkgutil import iter_modules
+from types import ModuleType
 from typing import ClassVar, Protocol, TypeVar, Union, cast
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.capability import TestProtocol
 from framework.testbed_model.port import Port
@@ -33,7 +41,7 @@
     PacketFilteringConfig,
 )
 
-from .exception import ConfigurationError, TestCaseVerifyError
+from .exception import ConfigurationError, InternalError, TestCaseVerifyError
 from .logger import DTSLogger, get_dts_logger
 from .utils import get_packet_summaries
 
@@ -112,10 +120,24 @@ def __init__(
         self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
 
     @classmethod
-    def get_test_cases(
+    def get_test_cases(cls) -> list[type["TestCase"]]:
+        """A list of all the available test cases."""
+
+        def is_test_case(function: Callable) -> bool:
+            if inspect.isfunction(function):
+                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
+                # But function.test_type exists.
+                if hasattr(function, "test_type"):
+                    return isinstance(function.test_type, TestCaseType)
+            return False
+
+        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
+
+    @classmethod
+    def filter_test_cases(
         cls, test_case_sublist: Sequence[str] | None = None
     ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
-        """Filter `test_case_subset` from this class.
+        """Filter `test_case_sublist` from this class.
 
         Test cases are regular (or bound) methods decorated with :func:`func_test`
         or :func:`perf_test`.
@@ -129,17 +151,8 @@ def get_test_cases(
             as methods are bound to instances and this method only has access to the class.
 
         Raises:
-            ConfigurationError: If a test case from `test_case_subset` is not found.
+            ConfigurationError: If a test case from `test_case_sublist` is not found.
         """
-
-        def is_test_case(function: Callable) -> bool:
-            if inspect.isfunction(function):
-                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
-                # But function.test_type exists.
-                if hasattr(function, "test_type"):
-                    return isinstance(function.test_type, TestCaseType)
-            return False
-
         if test_case_sublist is None:
             test_case_sublist = []
 
@@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool:
         func_test_cases = set()
         perf_test_cases = set()
 
-        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
-            if test_case_name in test_case_sublist_copy:
+        for test_case in cls.get_test_cases():
+            if test_case.name in test_case_sublist_copy:
                 # if test_case_sublist_copy is non-empty, remove the found test case
                 # so that we can look at the remainder at the end
-                test_case_sublist_copy.remove(test_case_name)
+                test_case_sublist_copy.remove(test_case.name)
             elif test_case_sublist:
                 # the original list not being empty means we're filtering test cases
-                # since we didn't remove test_case_name in the previous branch,
+                # since we didn't remove test_case.name in the previous branch,
                 # it doesn't match the filter and we don't want to remove it
                 continue
 
-            match test_case_function.test_type:
+            match test_case.test_type:
                 case TestCaseType.PERFORMANCE:
-                    perf_test_cases.add(test_case_function)
+                    perf_test_cases.add(test_case)
                 case TestCaseType.FUNCTIONAL:
-                    func_test_cases.add(test_case_function)
+                    func_test_cases.add(test_case)
 
         if test_case_sublist_copy:
             raise ConfigurationError(
@@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
     test case function to :class:`TestCase` and sets common variables.
     """
 
+    #:
+    name: ClassVar[str]
     #:
     test_type: ClassVar[TestCaseType]
     #: necessary for mypy so that it can treat this class as the function it's shadowing
@@ -560,6 +575,7 @@ def make_decorator(
 
         def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
             test_case = cast(type[TestCase], func)
+            test_case.name = func.__name__
             test_case.skip = cls.skip
             test_case.skip_reason = cls.skip_reason
             test_case.required_capabilities = set()
@@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
 func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
 #: The decorator for performance test cases.
 perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_obj(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            `obj` must be a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
+            if class_name == self.class_name:
+                return class_obj
+
+        raise InternalError(
+            f"Expected class {self.class_name} not found in module {self.module_name}."
+        )
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
+        PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites. If :data:`None`,
+                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
+            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
+                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_obj:
+                    test_suites.append(test_suite)
+            except InternalError as err:
+                get_dts_logger().warning(err)
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing them more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
index 2207957a7a..0d5f0e0b32 100644
--- a/dts/framework/testbed_model/capability.py
+++ b/dts/framework/testbed_model/capability.py
@@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
 
 import inspect
 from abc import ABC, abstractmethod
-from collections.abc import MutableSet, Sequence
+from collections.abc import MutableSet
 from dataclasses import dataclass
-from typing import Callable, ClassVar, Protocol
+from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
 
 from typing_extensions import Self
 
@@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
 from .sut_node import SutNode
 from .topology import Topology, TopologyType
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestCase
+
 
 class Capability(ABC):
     """The base class for various capabilities.
@@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
         if inspect.isclass(test_case_or_suite):
             if self.topology_type is not TopologyType.default:
                 self.add_to_required(test_case_or_suite)
-                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
-                for test_case in func_test_cases | perf_test_cases:
+                for test_case in test_case_or_suite.get_test_cases():
                     if test_case.topology_type.topology_type is TopologyType.default:
                         # test case topology has not been set, use the one set by the test suite
                         self.add_to_required(test_case)
@@ -446,7 +448,7 @@ class TestProtocol(Protocol):
     required_capabilities: ClassVar[set[Capability]] = set()
 
     @classmethod
-    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
+    def get_test_cases(cls) -> list[type["TestCase"]]:
         """Get test cases. Should be implemented by subclasses containing test cases.
 
         Raises:
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread
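
As a side illustration of the module-to-class naming convention
described in the discover_all() docstring above, a small standalone
sketch using pydantic's to_pascal helper (the module names are made-up
examples):

    from pydantic.alias_generators import to_pascal

    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
    TEST_SUITE_CLASS_PREFIX = "Test"

    # TestSuite_<snake_case name> -> Test<PascalCase name>
    for module_name in ("TestSuite_hello_world", "TestSuite_os_udp"):
        name = module_name[len(TEST_SUITE_MODULE_PREFIX):]
        print(module_name, "->", TEST_SUITE_CLASS_PREFIX + to_pascal(name))
    # TestSuite_hello_world -> TestHelloWorld
    # TestSuite_os_udp -> TestOsUdp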

* [PATCH v3 3/5] dts: use pydantic in the configuration
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 1/5] dts: add pydantic dependency Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-10-25 16:43   ` Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 4/5] dts: remove warlock dependency Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross-referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to the native str and Enum. This change was necessary to
  enable the discriminator for object unions (a standalone sketch of
  this pattern follows below)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying the validation and handling thanks to
  pattern matching
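
To illustrate the discriminator change above, here is a standalone
pydantic v2 sketch of the same pattern (it mirrors, but is not, the
framework's actual classes):

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import BaseModel, Field, TypeAdapter

    class TrafficGeneratorType(str, Enum):
        SCAPY = "SCAPY"

    class ScapyTrafficGeneratorConfig(BaseModel, frozen=True):
        type: Literal[TrafficGeneratorType.SCAPY]

    # With more generator types this becomes a Union discriminated by
    # the `type` field.
    TrafficGeneratorConfigTypes = Annotated[
        ScapyTrafficGeneratorConfig, Field(discriminator="type")
    ]

    config = TypeAdapter(TrafficGeneratorConfigTypes).validate_python(
        {"type": "SCAPY"}
    )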

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 844 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 458 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 671 insertions(+), 1223 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@ config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d0d95d00c7..154831e595 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,29 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
+from pydantic.config import JsonDict
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
 
 
 @unique
@@ -118,15 +109,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +128,10 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(BaseModel, frozen=True, extra="forbid"):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +140,57 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type. Each child class is required to define this field
+            so it can be distinguished from the other types.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,92 +209,51 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
-class SutNodeConfiguration(NodeConfiguration):
+class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
     Attributes:
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
-class TGNodeConfiguration(NodeConfiguration):
+class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
     Attributes:
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
 
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
 
-@dataclass(slots=True, frozen=True)
-class NodeInfo:
+
+class NodeInfo(BaseModel, frozen=True, extra="forbid"):
     """Supplemental node information.
 
     Attributes:
@@ -336,165 +270,187 @@ class NodeInfo:
     kernel_version: str
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: str) -> Path:
+    """Resolve a path as string into an absolute path."""
+    return Path(path).resolve()
 
-    The configuration used for building DPDK.
+
+class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
+    """DPDK location.
+
+    The path to the DPDK sources, build dir and type of location.
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    This class is distinguished from :class:`RemoteDPDKTreeLocation` by enforcing on-the-fly
+    validation.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tarball location.
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
+    This class is distinguished from :class:`RemoteDPDKTarballLocation` by enforcing on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: Path
+
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
+
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
+
+
+class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
+
+
+class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tree location.
+
+    This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on-the-fly validation.
 
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tarball location.
+
+    This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
+
+
+class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The base configuration for different types of build.
+
+    The configuration contains the location of the DPDK and the configuration used to build it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK precompiled build configuration.
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
+    Attributes:
+        precompiled_build_dir: The directory where DPDK was pre-compiled, located in a
+            subdirectory of the `dpdk_tree` or `tarball` root directory.
+    """
 
+    precompiled_build_dir: str = Field(min_length=1)
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildInfo:
+
+class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target OS to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
+
+
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK uncompiled build configuration.
+
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
+
+
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
+
+
+class DPDKBuildInfo(BaseModel, frozen=True, extra="forbid"):
     """Various versions and other information about a DPDK build.
 
     Attributes:
@@ -506,44 +462,108 @@ class DPDKBuildInfo:
     compiler_version: str | None
 
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+def make_parsable_schema(schema: JsonDict) -> None:
+    """Update a model's JSON schema to make a string representation a valid alternative.
+
+    This utility function is meant to be used with models that can be represented and validated
+    as a string instead of an object mapping. Normally the generated JSON schema shows only the
+    object mapping. This function wraps the mapping in an ``anyOf`` list alongside a plain string
+    type.
+
+    This function is a valid `Callable` for the
+    :attr:`~pydantic.config.ConfigDict.json_schema_extra` attribute.
+    """
+    inner_schema = schema.copy()
+    del inner_schema["title"]
+
+    title = schema.get("title")
+    description = schema.get("description")
+
+    schema.clear()
+
+    schema["title"] = title
+    schema["description"] = description
+    schema["anyOf"] = [inner_schema, {"type": "string"}]
+
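+# Illustrative effect (hypothetical input): given a generated schema
+#   {"title": "T", "description": "D", "type": "object", ...}
+# this function rewrites it in place to
+#   {"title": "T", "description": "D", "anyOf": [{"description": "D", "type": "object", ...},
+#                                                {"type": "string"}]}
+# so that a plain string also validates against the model's schema.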
+
+class TestSuiteConfig(
+    BaseModel, frozen=True, extra="forbid", json_schema_extra=make_parsable_schema
+):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. In the configuration file, it can also
+    be represented and validated as a string in the form ``TEST_SUITE [TEST_CASE, ...]``.
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        # Materialize the names in a set so the membership check below can run for
+        # every requested test case, not just the first one.
+        available_test_cases = {
+            t.name for t in self.test_suite_spec.class_obj.get_test_cases()
+        }
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
 
+class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The SUT node configuration of a test run.
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
+
+
+class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -555,144 +575,130 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
 
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
-
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
-
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(BaseModel, extra="forbid"):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
 
-        return cls(test_runs=test_runs)
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
+
+        return self
+
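+    # Illustrative two-sided link (hypothetical addresses): node "sut" port
+    # 0000:00:08.0 names peer ("tg", 0000:00:08.0), and node "tg" names the mirror
+    # image; each (node, pci) pair may be claimed as a peer exactly once.
+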
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -702,14 +708,14 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
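+
+
+# Illustrative use (assuming a valid conf.yaml in the working directory):
+#
+#   config = load_config(Path("conf.yaml"))
+#   for test_run, sut, tg in config.test_runs_with_nodes:
+#       print(test_run.test_suites, sut.name, tg.name)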
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index 3e37555fc2..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,458 +0,0 @@
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@ def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@ def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if build_dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location,
+                    precompiled_build_dir=build_dir,
+                )
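+            # Note (illustrative): model_copy(update=...) is Pydantic's idiom for
+            # deriving a modified copy of a frozen model, leaving the original test
+            # run configuration untouched.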
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a452319b90..1253ed86ac 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@ class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    the appropriate :class:`DPDKLocation` subclass with the provided parameters if validation
+    is successful.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        parser: The argument parser, used to determine whether an argument value came from an
+            environment variable.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
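+
+# Illustrative outcomes (assuming the paths exist where required):
+#
+#   _process_dpdk_location(parser, "/src/dpdk", None, remote=False)
+#       -> LocalDPDKTreeLocation(dpdk_tree=Path("/src/dpdk"))
+#   _process_dpdk_location(parser, None, "dpdk.tar.xz", remote=True)
+#       -> RemoteDPDKTarballLocation(tarball=PurePath("dpdk.tar.xz"))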
 
 
 def _process_test_suites(
@@ -512,11 +519,24 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
@@ -536,7 +556,7 @@ def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 6194ddb989..23baf1df89 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -345,7 +345,7 @@ def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -356,7 +356,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@ class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
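+            node_name: The name of the node the port belongs to.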
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5ab7c18fb7..7a6a1b6f84 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
@@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo:
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return NodeInfo(
+            os_name=os_release_info[0].strip(),
+            os_version=os_release_info[1].strip(),
+            kernel_version=kernel_version,
+        )
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index e160386324..f3d1eac68e 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -14,13 +14,19 @@
 
 import os
 import time
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
     DPDKBuildInfo,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
     NodeInfo,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -166,7 +172,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -176,12 +184,12 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -190,7 +198,8 @@ def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -199,21 +208,26 @@ def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
@@ -226,37 +240,29 @@ def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
         Verify that the DPDK source tree exists on the SUT node and, if it does,
         set the `_remote_dpdk_tree_path` property.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -276,25 +282,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -312,30 +338,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -348,33 +353,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check existence on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
         Otherwise, sets nothing.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: The directory in which DPDK has been pre-built, located
                 in a subdirectory of `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..e862e3ac66 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v3 4/5] dts: remove warlock dependency
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
                     ` (2 preceding siblings ...)
  2024-10-25 16:43   ` [PATCH v3 3/5] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-10-25 16:43   ` Luca Vizzarro
  2024-10-25 16:43   ` [PATCH v3 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Since pydantic has completely replaced warlock, there is no longer any
need to keep the latter as a dependency. This change removes it.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 227 +--------------------------------------------
 dts/pyproject.toml |   1 -
 2 files changed, 1 insertion(+), 227 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 56c50ad52c..9f7db60793 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,24 +34,6 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
-optional = false
-python-versions = ">=3.7"
-files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
-]
-
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
 [package.extras]
 i18n = ["Babel (>=2.7)"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -1073,21 +995,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
 [[package]]
 name = "requests"
 version = "2.31.0"
@@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
 socks = ["PySocks (>=1.5.6,!=1.5.7)"]
 use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -1472,17 +1273,6 @@ files = [
     {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
 ]
 
-[[package]]
-name = "typing-extensions"
-version = "4.11.0"
-description = "Backported and Experimental Type Hints for Python 3.8+"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
-]
-
 [[package]]
 name = "typing-extensions"
 version = "4.12.2"
@@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
 socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
 zstd = ["zstandard (>=0.18.0)"]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
+content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6c2d1ca8a4..9a3fb02ee9 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v3 5/5] dts: use TestSuiteSpec class imports
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
                     ` (3 preceding siblings ...)
  2024-10-25 16:43   ` [PATCH v3 4/5] dts: remove warlock dependency Luca Vizzarro
@ 2024-10-25 16:43   ` Luca Vizzarro
  4 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-25 16:43 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are automatically imported in the process. Because the runner also
imported the test suites on its own, every test suite module ended up
being imported twice. This changes the behaviour of the runner to load
the already-imported classes from TestSuiteSpec instead of importing
them again.
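
As an illustration, the class lookup in the runner now reduces to
reusing the class object cached during discovery, roughly as in this
simplified sketch of the loop from the hunk below (only
`test_suite_spec.class_obj` is introduced by this patch; the rest of
the loop body is abbreviated):

    for test_suite_config in test_suite_configs:
        # class_obj was populated when TestSuiteSpec discovered and
        # imported the test suite module, so no second import happens.
        test_suite_class = test_suite_config.test_suite_spec.class_obj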

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py | 84 ++++-------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases
 
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
     def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 0/8] dts: Pydantic configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (6 preceding siblings ...)
  2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
@ 2024-10-28 17:49 ` Luca Vizzarro
  2024-10-28 17:49   ` [PATCH v4 1/8] dts: add pydantic dependency Luca Vizzarro
                     ` (7 more replies)
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
  9 siblings, 8 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Hi there,

sending a v4 for the pydantic changes.

v4:
- added autodoc_pydantic due to autodoc warnings
- fixed pydantic models docstrings
- updated docs
- refactored DPDKBuildInfo and NodeInfo, which didn't belong in the
  configuration
v3:
- removed the common FrozenModel and configured each BaseModel
  individually, due to mypy complaints
v2:
- rebased and merge conflicts resolved:
  - capabilities patch introducing TestCase has now been combined with
    TestSuiteSpec
  - the external build patch added more configuration complexity, which
    has been re-worked in pydantic by adding exclusion via structured
    models
- split pydantic/warlock dependency chains
- deleted the config schema as no longer needed
- removed config schema generator
- turned all configuration dataclasses into Pydantic BaseModels
- refactored
- improved docstrings

Best,
Luca

---
Depends-on: series-33590 ("DTS external DPDK build")

Luca Vizzarro (8):
  dts: add pydantic dependency
  dts: add TestSuiteSpec class and discovery
  dts: refactor build and node info classes
  dts: use pydantic in the configuration
  dts: remove warlock dependency
  dts: add autodoc pydantic
  dts: improve configuration API docs
  dts: use TestSuiteSpec class imports

 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 doc/guides/conf.py                            |  13 +
 doc/guides/tools/dts.rst                      | 192 +---
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 848 ++++++++----------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ---
 dts/framework/runner.py                       | 139 +--
 dts/framework/settings.py                     | 124 +--
 dts/framework/test_result.py                  |   4 +-
 dts/framework/test_suite.py                   | 189 +++-
 dts/framework/testbed_model/capability.py     |  12 +-
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |  25 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  14 +-
 dts/framework/testbed_model/sut_node.py       | 200 +++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/poetry.lock                               | 423 +++++----
 dts/pyproject.toml                            |   3 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 26 files changed, 1067 insertions(+), 1793 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 1/8] dts: add pydantic dependency
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 18:42     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                     ` (6 subsequent siblings)
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

As part of the configuration validation and deserialization
improvements, this change adds pydantic as a project dependency.
Pydantic is a library that covers both of these needs while simplifying
the validation code and the overall process.
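
For context, here is a minimal sketch of the validation style pydantic
enables; the model and field names are purely illustrative and are not
the ones introduced later in this series:

    from pydantic import BaseModel, ValidationError

    class ExampleNodeConfig(BaseModel):  # illustrative only
        name: str
        hugepages: int = 0

    try:
        # The wrong type for hugepages makes pydantic raise a
        # ValidationError naming the offending field and expected type.
        ExampleNodeConfig(name="sut1", hugepages="not-a-number")
    except ValidationError as error:
        print(error)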

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   1 +
 2 files changed, 170 insertions(+), 2 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index cf5f6569c6..56c50ad52c 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 
 [[package]]
 name = "aenum"
@@ -23,6 +23,17 @@ files = [
     {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
 ]
 
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -567,6 +578,16 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -762,6 +783,130 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.9.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
+    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.6.0"
+pydantic-core = "2.23.4"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+timezone = ["tzdata"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.23.4"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
+    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
+    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
+    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
+    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
+    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
+    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
+    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
+    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
+    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
+    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
+    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
+    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
+    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -880,6 +1025,7 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -887,8 +1033,16 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -905,6 +1059,7 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -912,6 +1067,7 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -1327,6 +1483,17 @@ files = [
     {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
 ]
 
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
+]
+
 [[package]]
 name = "urllib3"
 version = "2.0.7"
@@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
+content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 506380ac2f..6c2d1ca8a4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -28,6 +28,7 @@ scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
 aenum = "^3.1.15"
+pydantic = "^2.9.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
  2024-10-28 17:49   ` [PATCH v4 1/8] dts: add pydantic dependency Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 19:32     ` Nicholas Pratte
  2024-10-31 20:21     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 3/8] dts: refactor build and node info classes Luca Vizzarro
                     ` (5 subsequent siblings)
  7 siblings, 2 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Currently there is no definition that identifies all the test suites
available to test. This change simplifies the process of discovering
all the test suites and identifying them.
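
As an illustration, a minimal sketch of how the discovery added in this
patch is meant to be consumed (the `hello_world` suite name is just an
example)::

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # Test suites are discovered and imported once, at module import time.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.module_name, spec.class_name)

    # A name such as "hello_world" resolves to the TestSuite_hello_world
    # module and, in PascalCase, to the TestHelloWorld class.
    spec = find_by_name("hello_world")
    if spec is not None:
        suite_class = spec.class_obj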

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py                   |   2 +-
 dts/framework/test_suite.py               | 189 +++++++++++++++++++---
 dts/framework/testbed_model/capability.py |  12 +-
 3 files changed, 177 insertions(+), 26 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 8bbe698eaf..195622c653 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
         for test_suite_config in test_suite_configs:
             test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
             test_cases: list[type[TestCase]] = []
-            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
+            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases
             )
             if func:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index cbe3b30ffc..936eb2cede 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -16,13 +17,20 @@
 import inspect
 from collections import Counter
 from collections.abc import Callable, Sequence
+from dataclasses import dataclass
 from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
+from pkgutil import iter_modules
+from types import ModuleType
 from typing import ClassVar, Protocol, TypeVar, Union, cast
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.capability import TestProtocol
 from framework.testbed_model.port import Port
@@ -33,7 +41,7 @@
     PacketFilteringConfig,
 )
 
-from .exception import ConfigurationError, TestCaseVerifyError
+from .exception import ConfigurationError, InternalError, TestCaseVerifyError
 from .logger import DTSLogger, get_dts_logger
 from .utils import get_packet_summaries
 
@@ -112,10 +120,24 @@ def __init__(
         self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
 
     @classmethod
-    def get_test_cases(
+    def get_test_cases(cls) -> list[type["TestCase"]]:
+        """A list of all the available test cases."""
+
+        def is_test_case(function: Callable) -> bool:
+            if inspect.isfunction(function):
+                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
+                # But function.test_type exists.
+                if hasattr(function, "test_type"):
+                    return isinstance(function.test_type, TestCaseType)
+            return False
+
+        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
+
+    @classmethod
+    def filter_test_cases(
         cls, test_case_sublist: Sequence[str] | None = None
     ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
-        """Filter `test_case_subset` from this class.
+        """Filter `test_case_sublist` from this class.
 
         Test cases are regular (or bound) methods decorated with :func:`func_test`
         or :func:`perf_test`.
@@ -129,17 +151,8 @@ def get_test_cases(
             as methods are bound to instances and this method only has access to the class.
 
         Raises:
-            ConfigurationError: If a test case from `test_case_subset` is not found.
+            ConfigurationError: If a test case from `test_case_sublist` is not found.
         """
-
-        def is_test_case(function: Callable) -> bool:
-            if inspect.isfunction(function):
-                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
-                # But function.test_type exists.
-                if hasattr(function, "test_type"):
-                    return isinstance(function.test_type, TestCaseType)
-            return False
-
         if test_case_sublist is None:
             test_case_sublist = []
 
@@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool:
         func_test_cases = set()
         perf_test_cases = set()
 
-        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
-            if test_case_name in test_case_sublist_copy:
+        for test_case in cls.get_test_cases():
+            if test_case.name in test_case_sublist_copy:
                 # if test_case_sublist_copy is non-empty, remove the found test case
                 # so that we can look at the remainder at the end
-                test_case_sublist_copy.remove(test_case_name)
+                test_case_sublist_copy.remove(test_case.name)
             elif test_case_sublist:
                 # the original list not being empty means we're filtering test cases
-                # since we didn't remove test_case_name in the previous branch,
+                # since we didn't remove test_case.name in the previous branch,
                 # it doesn't match the filter and we don't want to remove it
                 continue
 
-            match test_case_function.test_type:
+            match test_case.test_type:
                 case TestCaseType.PERFORMANCE:
-                    perf_test_cases.add(test_case_function)
+                    perf_test_cases.add(test_case)
                 case TestCaseType.FUNCTIONAL:
-                    func_test_cases.add(test_case_function)
+                    func_test_cases.add(test_case)
 
         if test_case_sublist_copy:
             raise ConfigurationError(
@@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
     test case function to :class:`TestCase` and sets common variables.
     """
 
+    #:
+    name: ClassVar[str]
     #:
     test_type: ClassVar[TestCaseType]
     #: necessary for mypy so that it can treat this class as the function it's shadowing
@@ -560,6 +575,7 @@ def make_decorator(
 
         def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
             test_case = cast(type[TestCase], func)
+            test_case.name = func.__name__
             test_case.skip = cls.skip
             test_case.skip_reason = cls.skip_reason
             test_case.required_capabilities = set()
@@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
 func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
 #: The decorator for performance test cases.
 perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_obj(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            `obj` qualifies if it is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
+            if class_name == self.class_name:
+                return class_obj
+
+        raise InternalError(
+            f"Expected class {self.class_name} not found in module {self.module_name}."
+        )
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
+        PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites. If :data:`None`,
+                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
+            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
+                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_obj:
+                    test_suites.append(test_suite)
+            except InternalError as err:
+                get_dts_logger().warning(err)
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
index 2207957a7a..0d5f0e0b32 100644
--- a/dts/framework/testbed_model/capability.py
+++ b/dts/framework/testbed_model/capability.py
@@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
 
 import inspect
 from abc import ABC, abstractmethod
-from collections.abc import MutableSet, Sequence
+from collections.abc import MutableSet
 from dataclasses import dataclass
-from typing import Callable, ClassVar, Protocol
+from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
 
 from typing_extensions import Self
 
@@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
 from .sut_node import SutNode
 from .topology import Topology, TopologyType
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestCase
+
 
 class Capability(ABC):
     """The base class for various capabilities.
@@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
         if inspect.isclass(test_case_or_suite):
             if self.topology_type is not TopologyType.default:
                 self.add_to_required(test_case_or_suite)
-                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
-                for test_case in func_test_cases | perf_test_cases:
+                for test_case in test_case_or_suite.get_test_cases():
                     if test_case.topology_type.topology_type is TopologyType.default:
                         # test case topology has not been set, use the one set by the test suite
                         self.add_to_required(test_case)
@@ -446,7 +448,7 @@ class TestProtocol(Protocol):
     required_capabilities: ClassVar[set[Capability]] = set()
 
     @classmethod
-    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
+    def get_test_cases(cls) -> list[type["TestCase"]]:
         """Get test cases. Should be implemented by subclasses containing test cases.
 
         Raises:
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 3/8] dts: refactor build and node info classes
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
  2024-10-28 17:49   ` [PATCH v4 1/8] dts: add pydantic dependency Luca Vizzarro
  2024-10-28 17:49   ` [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 20:16     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 4/8] dts: use pydantic in the configuration Luca Vizzarro
                     ` (4 subsequent siblings)
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

The DPDKBuildInfo and NodeInfo classes, representing information
gathered at runtime, were erroneously placed in the configuration
package. This change moves them into more appropriate modules.

NodeInfo, specifically, is moved to os_session instead of node, mostly
to avoid circular dependencies. Given that os_session is the top-most
module to reference it, it appears to be the most suitable place
outside of node.
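
After this patch the classes are imported from their new homes; a
minimal sketch (field values are purely illustrative)::

    from framework.testbed_model.os_session import NodeInfo
    from framework.testbed_model.sut_node import DPDKBuildInfo

    # Both remain frozen dataclasses; only their modules changed.
    node_info = NodeInfo(os_name="Linux", os_version="22.04",
                         kernel_version="5.15.0")
    build_info = DPDKBuildInfo(dpdk_version="24.07", compiler_version=None)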

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/config/__init__.py             | 31 --------------------
 dts/framework/test_result.py                 |  4 ++-
 dts/framework/testbed_model/os_session.py    | 21 ++++++++++++-
 dts/framework/testbed_model/posix_session.py |  4 +--
 dts/framework/testbed_model/sut_node.py      | 18 ++++++++++--
 5 files changed, 40 insertions(+), 38 deletions(-)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d0d95d00c7..7403ccbf14 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -318,24 +318,6 @@ class TGNodeConfiguration(NodeConfiguration):
     traffic_generator: TrafficGeneratorConfig
 
 
-@dataclass(slots=True, frozen=True)
-class NodeInfo:
-    """Supplemental node information.
-
-    Attributes:
-        os_name: The name of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        os_version: The version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        kernel_version: The kernel version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-    """
-
-    os_name: str
-    os_version: str
-    kernel_version: str
-
-
 @dataclass(slots=True, frozen=True)
 class DPDKBuildConfiguration:
     """DPDK build configuration.
@@ -493,19 +475,6 @@ def from_dict(cls, d: DPDKConfigurationDict) -> Self:
         )
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildInfo:
-    """Various versions and other information about a DPDK build.
-
-    Attributes:
-        dpdk_version: The DPDK version that was built.
-        compiler_version: The version of the compiler used to build DPDK.
-    """
-
-    dpdk_version: str | None
-    compiler_version: str | None
-
-
 @dataclass(slots=True, frozen=True)
 class TestSuiteConfig:
     """Test suite configuration.
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 00263ad69e..d2f3a90eed 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -30,11 +30,13 @@
 
 from framework.testbed_model.capability import Capability
 
-from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig
+from .config import TestRunConfiguration, TestSuiteConfig
 from .exception import DTSError, ErrorSeverity
 from .logger import DTSLogger
 from .settings import SETTINGS
 from .test_suite import TestCase, TestSuite
+from .testbed_model.os_session import NodeInfo
+from .testbed_model.sut_node import DPDKBuildInfo
 
 
 @dataclass(slots=True, frozen=True)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 6194ddb989..5f087f40d6 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -24,11 +24,12 @@
 """
 from abc import ABC, abstractmethod
 from collections.abc import Iterable
+from dataclasses import dataclass
 from ipaddress import IPv4Interface, IPv6Interface
 from pathlib import Path, PurePath, PurePosixPath
 from typing import Union
 
-from framework.config import Architecture, NodeConfiguration, NodeInfo
+from framework.config import Architecture, NodeConfiguration
 from framework.logger import DTSLogger
 from framework.remote_session import (
     InteractiveRemoteSession,
@@ -44,6 +45,24 @@
 from .port import Port
 
 
+@dataclass(slots=True, frozen=True)
+class NodeInfo:
+    """Supplemental node information.
+
+    Attributes:
+        os_name: The name of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        os_version: The version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        kernel_version: The kernel version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+    """
+
+    os_name: str
+    os_version: str
+    kernel_version: str
+
+
 class OSSession(ABC):
     """OS-unaware to OS-aware translation API definition.
 
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5ab7c18fb7..0d3abbc519 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -15,7 +15,7 @@
 from collections.abc import Iterable
 from pathlib import Path, PurePath, PurePosixPath
 
-from framework.config import Architecture, NodeInfo
+from framework.config import Architecture
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
 from framework.settings import SETTINGS
 from framework.utils import (
@@ -26,7 +26,7 @@
     extract_tarball,
 )
 
-from .os_session import OSSession
+from .os_session import NodeInfo, OSSession
 
 
 class PosixSession(OSSession):
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index e160386324..a6c42b548c 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -14,13 +14,12 @@
 
 import os
 import time
+from dataclasses import dataclass
 from pathlib import PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKBuildInfo,
     DPDKLocation,
-    NodeInfo,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -30,10 +29,23 @@
 from framework.utils import MesonArgs, TarCompressionFormat
 
 from .node import Node
-from .os_session import OSSession
+from .os_session import NodeInfo, OSSession
 from .virtual_device import VirtualDevice
 
 
+@dataclass(slots=True, frozen=True)
+class DPDKBuildInfo:
+    """Various versions and other information about a DPDK build.
+
+    Attributes:
+        dpdk_version: The DPDK version that was built.
+        compiler_version: The version of the compiler used to build DPDK.
+    """
+
+    dpdk_version: str | None
+    compiler_version: str | None
+
+
 class SutNode(Node):
     """The system under test node.
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 4/8] dts: use pydantic in the configuration
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (2 preceding siblings ...)
  2024-10-28 17:49   ` [PATCH v4 3/8] dts: refactor build and node info classes Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 20:20     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 5/8] dts: remove warlock dependency Luca Vizzarro
                     ` (3 subsequent siblings)
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross-referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to the native str and Enum, which is necessary to enable
  the discriminator for object unions (see the sketch after this list)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying the validation and handling thanks to
  pattern matching
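
To illustrate the discriminator point, the pattern looks roughly like
the following self-contained sketch; the DUMMY generator is
hypothetical, added only to make the union non-trivial::

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import BaseModel, Field

    class TrafficGeneratorType(str, Enum):
        # (str, Enum) instead of StrEnum, so that members can be used
        # as Literal tag values in the discriminated union below.
        SCAPY = "SCAPY"
        DUMMY = "DUMMY"

    class TrafficGeneratorConfig(BaseModel, frozen=True):
        type: TrafficGeneratorType

    class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
        type: Literal[TrafficGeneratorType.SCAPY]

    class DummyTrafficGeneratorConfig(TrafficGeneratorConfig):
        type: Literal[TrafficGeneratorType.DUMMY]

    # Pydantic picks the concrete model based on the "type" field alone.
    TrafficGeneratorConfigTypes = Annotated[
        ScapyTrafficGeneratorConfig | DummyTrafficGeneratorConfig,
        Field(discriminator="type"),
    ]

    class TGConfig(BaseModel):
        traffic_generator: TrafficGeneratorConfigTypes

    tg = TGConfig.model_validate({"traffic_generator": {"type": "SCAPY"}})
    assert isinstance(tg.traffic_generator, ScapyTrafficGeneratorConfig)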

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 822 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  10 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 653 insertions(+), 1220 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@ config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 7403ccbf14..c86bfaaabf 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,28 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
 
 
 @unique
@@ -118,15 +108,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +127,10 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(BaseModel, frozen=True, extra="forbid"):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +139,57 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, which each child class is required to define
+            to be distinguished among others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,285 +208,317 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
-class SutNodeConfiguration(NodeConfiguration):
+class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
     Attributes:
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
-class TGNodeConfiguration(NodeConfiguration):
+class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
     Attributes:
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
+
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
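
For reference, a node entry accepted by this model hierarchy might look like the following
(all values are illustrative; a mapping with a `traffic_generator` key parses as a TG node,
otherwise as a SUT node):

.. code:: yaml

    nodes:
    - name: sut1
      hostname: 192.168.0.2
      user: root
      arch: x86_64
      os: linux
      ports:
      - pci: "0000:00:08.0"
        os_driver_for_dpdk: vfio-pci
        os_driver: i40e
        peer_node: tg1
        peer_pci: "0000:00:08.0"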
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: Path) -> Path:
+    """Resolve a path into a real path."""
+    return path.resolve()
 
-    The configuration used for building DPDK.
+
+class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
+    """DPDK location.
+
+    The path to the DPDK sources and the type of location.
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    Unlike :class:`RemoteDPDKTreeLocation`, this class performs on-the-fly validation of the
+    given path.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tarball location.
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
+    Unlike :class:`RemoteDPDKTarballLocation`, this class performs on-the-fly validation of the
+    given tarball.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
 
+    tarball: Path
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
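
A short standalone sketch of the fail-fast behaviour (the path is invented): Pydantic wraps
the assertions of the validators above into a ValidationError at construction time:

.. code:: python

    from pydantic import ValidationError

    try:
        LocalDPDKTarballLocation(tarball="/no/such/dpdk.tar.xz")
    except ValidationError as e:
        print(e)  # reports: DPDK tarball not found in local filesystem.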
+
+
+class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
+
+
+class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tree location.
+
+    This class is distinct from :class:`LocalDPDKTreeLocation`, which performs on-the-fly
+    validation.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
+
+
+class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tarball location.
+
+    This class is distinct from :class:`LocalDPDKTarballLocation`, which performs on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
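
Because every variant pins `remote` to a literal and forbids extra fields, plain input data
selects exactly one member of the union. A standalone sketch (the path is invented):

.. code:: python

    from pydantic import TypeAdapter

    location = TypeAdapter(DPDKLocation).validate_python(
        {"dpdk_tree": "/srv/dpdk", "remote": True}
    )
    assert isinstance(location, RemoteDPDKTreeLocation)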
+
+
+class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The base configuration for different types of build.
+
+    The configuration contain the location of the DPDK and configuration used for building it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK precompiled build configuration.
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
+    Attributes:
+        precompiled_build_dir: The name of the subdirectory of the `dpdk_tree` or `tarball` root
+            directory which contains the pre-compiled DPDK binaries.
+    """
+
+    precompiled_build_dir: str = Field(min_length=1)
+
+
+class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target os to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
 
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK uncompiled build configuration.
+
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
+
+
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
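
In the YAML configuration, the two variants might look like this (values are illustrative;
the presence of `precompiled_build_dir` or `build_options` picks the model):

.. code:: yaml

    # pre-compiled build
    dpdk_build:
      dpdk_location:
        dpdk_tree: /path/to/dpdk
      precompiled_build_dir: build
    ---
    # built from source
    dpdk_build:
      dpdk_location:
        tarball: /path/to/dpdk.tar.xz
      build_options:
        arch: x86_64
        os: linux
        cpu: native
        compiler: gcc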
+
+
+class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. It can also be represented as a string
+    instead of a mapping, for example:
+
+    .. code:: yaml
+
+        test_runs:
+        - test_suites:
+            # As string representation:
+            - hello_world # test all of `hello_world`, or
+            - hello_world hello_world_single_core # test only `hello_world_single_core`
+            # or as model fields:
+            - test_suite: hello_world
+              test_cases: [hello_world_single_core] # without this field all test cases are run
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation of the model into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        available_test_cases = map(
+            lambda t: t.name, self.test_suite_spec.class_obj.get_test_cases()
+        )
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
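
The effect of the "before" validator above, as a standalone sketch (it assumes the DTS
framework is on the import path, since the test suite name is validated against the
discovered suites):

.. code:: python

    config = TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    assert config.test_suite_name == "hello_world"
    assert config.test_cases_names == ["hello_world_single_core"]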
 
+class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The SUT node configuration of a test run.
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
+
+
+class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -524,144 +530,130 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
-
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
 
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
-
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(BaseModel, extra="forbid"):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
+
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
 
-        return cls(test_runs=test_runs)
+        return self
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -671,14 +663,14 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
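
End-to-end, loading a configuration now reduces to a single Pydantic call. A usage sketch
(the file name is invented):

.. code:: python

    from pathlib import Path

    from framework.config import load_config

    config = load_config(Path("conf.yaml"))
    for test_run, sut_node, tg_node in config.test_runs_with_nodes:
        print(sut_node.name, tg_node.name, len(test_run.test_suites))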
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index cc3e78cef5..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,459 +0,0 @@
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter",
-        "vlan"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@ def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@ def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=dir
+                )
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a452319b90..1253ed86ac 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@ class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    any valid :class:`DPDKLocation` with the provided parameters if validation is successful.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
 
 
 def _process_test_suites(
@@ -512,11 +519,24 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
@@ -536,7 +556,7 @@ def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 5f087f40d6..42ab4bb8fd 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -364,7 +364,7 @@ def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@ class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 0d3abbc519..6b66f33e22 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
@@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo:
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return NodeInfo(
+            os_name=os_release_info[0].strip(),
+            os_version=os_release_info[1].strip(),
+            kernel_version=kernel_version,
+        )
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index a6c42b548c..57337c8e7d 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,11 +15,17 @@
 import os
 import time
 from dataclasses import dataclass
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -188,12 +196,12 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -211,21 +220,26 @@ def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
@@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
         Verify that the DPDK source tree exists on the SUT node and, if it does, set the
         `_remote_dpdk_tree_path` property.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check that the build directory exists on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: DPDK has been pre-built and the build directory is located
                in a subdirectory of the `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..e862e3ac66 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 5/8] dts: remove warlock dependency
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (3 preceding siblings ...)
  2024-10-28 17:49   ` [PATCH v4 4/8] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 20:23     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 6/8] dts: add autodoc pydantic Luca Vizzarro
                     ` (2 subsequent siblings)
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Since pydantic has completely replaced warlock, there is no longer any
need to keep it as a dependency. This change removes it.
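
The schema validation that warlock performed on the loaded YAML is now
expressed by the pydantic models themselves. A minimal sketch of the idea
(illustrative only; this model and its fields are hypothetical, not code
from this series):

    from pydantic import BaseModel, ValidationError

    class ExampleNodeConfig(BaseModel):
        """Hypothetical stand-in for a DTS configuration section."""

        name: str
        hostname: str

    try:
        # A missing "hostname" raises a ValidationError, much like a
        # failed JSON schema check did under warlock.
        ExampleNodeConfig.model_validate({"name": "SUT1"})
    except ValidationError as e:
        print(e)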

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/poetry.lock    | 227 +--------------------------------------------
 dts/pyproject.toml |   1 -
 2 files changed, 1 insertion(+), 227 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 56c50ad52c..9f7db60793 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,24 +34,6 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
-optional = false
-python-versions = ">=3.7"
-files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
-]
-
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
 [package.extras]
 i18n = ["Babel (>=2.7)"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -1073,21 +995,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
 [[package]]
 name = "requests"
 version = "2.31.0"
@@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
 socks = ["PySocks (>=1.5.6,!=1.5.7)"]
 use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -1472,17 +1273,6 @@ files = [
     {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
 ]
 
-[[package]]
-name = "typing-extensions"
-version = "4.11.0"
-description = "Backported and Experimental Type Hints for Python 3.8+"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
-]
-
 [[package]]
 name = "typing-extensions"
 version = "4.12.2"
@@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
 socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
 zstd = ["zstandard (>=0.18.0)"]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
+content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6c2d1ca8a4..9a3fb02ee9 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 6/8] dts: add autodoc pydantic
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (4 preceding siblings ...)
  2024-10-28 17:49   ` [PATCH v4 5/8] dts: remove warlock dependency Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-10-31 20:52     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 7/8] dts: improve configuration API docs Luca Vizzarro
  2024-10-28 17:49   ` [PATCH v4 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Add and enable the autodoc-pydantic sphinx extension. Pydantic models
are not correctly recognised by autodoc, causing the generated docs to
lack all the actual model information. The autodoc-pydantic sphinx
extension fixes this by formatting the models correctly.
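
To illustrate the problem (the model below is hypothetical, not taken from
DTS): plain autodoc documents a pydantic model as an ordinary class, so
field types, defaults and descriptions are dropped from the rendered page,
whereas autodoc-pydantic emits them as properly documented fields.

    from pydantic import BaseModel, Field

    class ExampleConfig(BaseModel):
        """Without autodoc-pydantic, ``perf`` below would render without
        its default value or description."""

        perf: bool = Field(default=False, description="Enable performance testing.")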

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/guides/conf.py       |  13 +++
 doc/guides/tools/dts.rst | 187 ++-------------------------------------
 dts/poetry.lock          |  59 +++++++++++-
 dts/pyproject.toml       |   1 +
 4 files changed, 79 insertions(+), 181 deletions(-)

diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index b553d9d5bf..71fed45b3d 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -60,6 +60,19 @@
 # DTS API docs additional configuration
 if environ.get('DTS_DOC_BUILD'):
     extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+
+    # Pydantic models require autodoc_pydantic for the right formatting
+    try:
+        import sphinxcontrib.autodoc_pydantic
+
+        extensions.append("sphinxcontrib.autodoc_pydantic")
+    except ImportError:
+        print(
+            "The DTS API doc dependencies are missing. The generated output won't be "
+            "as intended, and autodoc may throw unexpected warnings.",
+            file=stderr,
+        )
+
     # Napoleon enables the Google format of Python docstrings.
     napoleon_numpy_docstring = False
     napoleon_attr_annotations = True
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index c52de1808c..7ccca63ae8 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries.
 Configuring DTS
 ~~~~~~~~~~~~~~~
 
-DTS configuration is split into nodes and test runs and build targets within test runs,
-and follows a defined schema as described in `Configuration Schema`_.
-By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_schema_example>`,
+DTS configuration is split into nodes and test runs, and must respect the model definitions as
+documented in the DTS API docs under the ``config`` page. The root of the configuration is
+represented by the ``Configuration`` model.
+By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_example>`,
 which is a template that illustrates what can be configured in DTS.
 
 The user must have :ref:`administrator privileges <sut_admin_user>`
@@ -470,184 +471,10 @@ The output is generated in ``build/doc/api/dts/html``.
 
    Make sure to fix any Sphinx warnings when adding or updating docstrings.
 
+.. _configuration_example:
 
-Configuration Schema
---------------------
-
-Definitions
-~~~~~~~~~~~
-
-_`Node name`
-   *string* – A unique identifier for a node.
-   **Examples**: ``SUT1``, ``TG1``.
-
-_`ARCH`
-   *string* – The CPU architecture.
-   **Supported values**: ``x86_64``, ``arm64``, ``ppc64le``.
-
-_`CPU`
-   *string* – The CPU microarchitecture. Use ``native`` for x86.
-   **Supported values**: ``native``, ``armv8a``, ``dpaa2``, ``thunderx``, ``xgene1``.
-
-_`OS`
-   *string* – The operating system. **Supported values**: ``linux``.
-
-_`Compiler`
-   *string* – The compiler used for building DPDK.
-   **Supported values**: ``gcc``, ``clang``, ``icc``, ``mscv``.
-
-_`Build target`
-   *mapping* – Build targets supported by DTS for building DPDK, described as:
-
-   ==================== =================================================================
-   ``arch``             See `ARCH`_
-   ``os``               See `OS`_
-   ``cpu``              See `CPU`_
-   ``compiler``         See `Compiler`_
-   ``compiler_wrapper`` *string* – Value prepended to the CC variable for the DPDK build.
-
-                        **Example**: ``ccache``
-   ==================== =================================================================
-
-_`hugepages_2mb`
-   *mapping* – hugepages_2mb described as:
-
-   ==================== ================================================================
-   ``number_of``        *integer* – The number of 2MB hugepages to configure.
-
-                        Hugepage size will be the system default.
-   ``force_first_numa`` (*optional*, defaults to ``false``) – If ``true``, it forces the
-
-                        configuration of hugepages on the first NUMA node.
-   ==================== ================================================================
-
-_`Network port`
-   *mapping* – the NIC port described as:
-
-   ====================== =================================================================================
-   ``pci``                *string* – the local PCI address of the port. **Example**: ``0000:00:08.0``
-   ``os_driver_for_dpdk`` | *string* – this port's device driver when using with DPDK
-                          | When setting up the SUT, DTS will bind the network device to this driver
-                          | for compatibility with DPDK.
-
-                          **Examples**: ``vfio-pci``, ``mlx5_core``
-   ``os_driver``          | *string* – this port's device driver when **not** using with DPDK
-                          | When tearing down the tests on the SUT, DTS will bind the network device
-                          | *back* to this driver. This driver is meant to be the one that the SUT would
-                          | normally use for this device, or whichever driver it is preferred to leave the
-                          | device bound to after testing.
-                          | This also represents the driver that is used in conjunction with the traffic
-                          | generator software.
-
-                          **Examples**: ``i40e``, ``mlx5_core``
-   ``peer_node``          *string* – the name of the peer node connected to this port.
-   ``peer_pci``           *string* – the PCI address of the peer node port. **Example**: ``000a:01:00.1``
-   ====================== =================================================================================
-
-_`Test suite`
-   *string* – name of the test suite to run. **Examples**: ``hello_world``, ``os_udp``
-
-_`Test target`
-   *mapping* – selects specific test cases to run from a test suite. Mapping is described as follows:
-
-   ========= ===============================================================================================
-   ``suite`` See `Test suite`_
-   ``cases`` (*optional*) *sequence* of *string* – list of the selected test cases in the test suite to run.
-
-             Unknown test cases will be silently ignored.
-   ========= ===============================================================================================
-
-
-Properties
-~~~~~~~~~~
-
-The configuration requires listing all the test run environments and nodes
-involved in the testing. These can be defined with the following mappings:
-
-``test runs``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the test run environments. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +----------------------------+-------------------------------------------------------------------+
-   | ``build_targets``          | *sequence* of `Build target`_                                     |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``perf``                   | *boolean* – Enable performance testing.                           |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``func``                   | *boolean* – Enable functional testing.                            |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``test_suites``            | *sequence* of **one of** `Test suite`_ **or** `Test target`_      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``skip_smoke_tests``       | (*optional*) *boolean* – Allows you to skip smoke testing         |
-   |                            | if ``true``.                                                      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``system_under_test_node`` | System under test node specified with:                            |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``node_name`` | See `Node name`_                                  |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``vdevs``     | (*optional*) *sequence* of *string*               |
-   |                            |               |                                                   |
-   |                            |               | List of virtual devices passed with the ``--vdev``|
-   |                            |               | argument to DPDK. **Example**: ``crypto_openssl`` |
-   +----------------------------+---------------+---------------------------------------------------+
-   | ``traffic_generator_node`` | Node name for the traffic generator node.                         |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``random_seed``            | (*optional*) *int* – Set a seed for pseudo-random generation.     |
-   +----------------------------+-------------------------------------------------------------------+
-
-``nodes``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the nodes. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``name``              | See `Node name`_                                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hostname``          | *string* – The network hostname or IP address of this node.                           |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``user``              | *string* – The SSH user credential to use to login to this node.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``password``          | (*optional*) *string* – The SSH password credential for this node.                    |
-   |                       |                                                                                       |
-   |                       | **NB**: Use only as last resort. SSH keys are **strongly** preferred.                 |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``arch``              | The architecture of this node. See `ARCH`_ for supported values.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``os``                | The operating system of this node. See `OS`_ for supported values.                    |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``lcores``            | | (*optional*, defaults to 1) *string* – Comma-separated list of logical              |
-   |                       | | cores to use. An empty string means use all lcores.                                 |
-   |                       |                                                                                       |
-   |                       | **Example**: ``1,2,3,4,5,18-22``                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``use_first_core``    | (*optional*, defaults to ``false``) *boolean*                                         |
-   |                       |                                                                                       |
-   |                       | Indicates whether DPDK should use only the first physical core or not.                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``memory_channels``   | (*optional*, defaults to 1) *integer*                                                 |
-   |                       |                                                                                       |
-   |                       | The number of the memory channels to use.                                             |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hugepages_2mb``     | (*optional*) See `hugepages_2mb`_. If unset, 2MB hugepages won't be configured        |
-   |                       |                                                                                       |
-   |                       | in favour of the system configuration.                                                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``ports``             | | *sequence* of `Network port`_ – Describe ports that are **directly** paired with    |
-   |                       | | other nodes used in conjunction with this one. Both ends of the links must be       |
-   |                       | | described. If there any inconsistencies DTS won't run.                              |
-   |                       |                                                                                       |
-   |                       | **Example**: port 1 of node ``SUT1`` is connected to port 1 of node ``TG1`` etc.      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``traffic_generator`` | (*optional*) Traffic generator, if any, setup on this node described as:              |
-   |                       +----------+----------------------------------------------------------------------------+
-   |                       | ``type`` | *string* – **Supported values**: *SCAPY*                                   |
-   +-----------------------+----------+----------------------------------------------------------------------------+
-
-
-.. _configuration_schema_example:
-
-Example
-~~~~~~~
+Configuration Example
+---------------------
 
 The following example (which can be found in ``dts/conf.yaml``) sets up two nodes:
 
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 9f7db60793..ee564676b4 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,6 +34,29 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
+[[package]]
+name = "autodoc-pydantic"
+version = "2.2.0"
+description = "Seamlessly integrate pydantic models in your Sphinx documentation."
+optional = false
+python-versions = "<4.0.0,>=3.8.1"
+files = [
+    {file = "autodoc_pydantic-2.2.0-py3-none-any.whl", hash = "sha256:8c6a36fbf6ed2700ea9c6d21ea76ad541b621fbdf16b5a80ee04673548af4d95"},
+]
+
+[package.dependencies]
+pydantic = ">=2.0,<3.0.0"
+pydantic-settings = ">=2.0,<3.0.0"
+Sphinx = ">=4.0"
+
+[package.extras]
+docs = ["myst-parser (>=3.0.0,<4.0.0)", "sphinx-copybutton (>=0.5.0,<0.6.0)", "sphinx-rtd-theme (>=2.0.0,<3.0.0)", "sphinx-tabs (>=3,<4)", "sphinxcontrib-mermaid (>=0.9.0,<0.10.0)"]
+erdantic = ["erdantic (<2.0)"]
+linting = ["ruff (>=0.4.0,<0.5.0)"]
+security = ["pip-audit (>=2.7.2,<3.0.0)"]
+test = ["coverage (>=7,<8)", "defusedxml (>=0.7.1)", "pytest (>=8.0.0,<9.0.0)", "pytest-sugar (>=1.0.0,<2.0.0)"]
+type-checking = ["mypy (>=1.9,<2.0)", "types-docutils (>=0.20,<0.21)", "typing-extensions (>=4.11,<5.0)"]
+
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -829,6 +852,26 @@ files = [
 [package.dependencies]
 typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
 
+[[package]]
+name = "pydantic-settings"
+version = "2.6.0"
+description = "Settings management using Pydantic"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_settings-2.6.0-py3-none-any.whl", hash = "sha256:4a819166f119b74d7f8c765196b165f95cc7487ce58ea27dec8a5a26be0970e0"},
+    {file = "pydantic_settings-2.6.0.tar.gz", hash = "sha256:44a1804abffac9e6a30372bb45f6cafab945ef5af25e66b1c634c01dd39e0188"},
+]
+
+[package.dependencies]
+pydantic = ">=2.7.0"
+python-dotenv = ">=0.21.0"
+
+[package.extras]
+azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"]
+toml = ["tomli (>=2.0.1)"]
+yaml = ["pyyaml (>=6.0.1)"]
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -935,6 +978,20 @@ cffi = ">=1.4.1"
 docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
 tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
 
+[[package]]
+name = "python-dotenv"
+version = "1.0.1"
+description = "Read key-value pairs from a .env file and set them as environment variables"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca"},
+    {file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"},
+]
+
+[package.extras]
+cli = ["click (>=5.0)"]
+
 [[package]]
 name = "pyyaml"
 version = "6.0.1"
@@ -1304,4 +1361,4 @@ zstd = ["zstandard (>=0.18.0)"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
+content-hash = "fe9a9fdf7b43e8dce2fb5ee600921d4047fef2f4037a78bbd150f71df202493e"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 9a3fb02ee9..f69c70877a 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -44,6 +44,7 @@ optional = true
 sphinx = "<=7"
 sphinx-rtd-theme = ">=1.2.2"
 pyelftools = "^0.31"
+autodoc-pydantic = "^2.2.0"
 
 [build-system]
 requires = ["poetry-core>=1.0.0"]
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 7/8] dts: improve configuration API docs
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (5 preceding siblings ...)
  2024-10-28 17:49   ` [PATCH v4 6/8] dts: add autodoc pydantic Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-11-04 17:34     ` Nicholas Pratte
  2024-10-28 17:49   ` [PATCH v4 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Pydantic models are not treated the same way as dataclasses by autodoc.
As a consequence, the docstrings need to be applied directly to each
field; otherwise the generated API documentation page would present two
entries for each field, each with its own differences.
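
For illustration, the fields end up documented with #: comments right
above each annotation instead of an Attributes: section, e.g. (a
trimmed sketch mirroring the hugepage model changed below):

    from pydantic import BaseModel

    class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
        """The hugepage configuration of nodes."""

        #: The number of hugepages to allocate.
        number_of: int
        #: If True, the hugepages will be configured on the first NUMA node.
        force_first_numa: bool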

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 doc/guides/tools/dts.rst         |   5 +-
 dts/framework/config/__init__.py | 253 +++++++++++--------------------
 2 files changed, 88 insertions(+), 170 deletions(-)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 7ccca63ae8..ac12c5c4fa 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -1,5 +1,6 @@
 ..  SPDX-License-Identifier: BSD-3-Clause
     Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
+    Copyright(c) 2024 Arm Limited
 
 DPDK Test Suite
 ===============
@@ -327,8 +328,8 @@ where we deviate or where some additional clarification is helpful:
    * The ``dataclass.dataclass`` decorator changes how the attributes are processed.
      The dataclass attributes which result in instance variables/attributes
      should also be recorded in the ``Attributes:`` section.
-   * Class variables/attributes, on the other hand, should be documented with ``#:``
-     above the type annotated line.
+   * Class variables/attributes and Pydantic model fields, on the other hand, should be documented
+     with ``#:`` above the type annotated line.
      The description may be omitted if the meaning is obvious.
    * The ``Enum`` and ``TypedDict`` also process the attributes in particular ways
      and should be documented with ``#:`` as well.
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index c86bfaaabf..d7d3907a33 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -116,54 +116,34 @@ class TrafficGeneratorType(str, Enum):
 
 
 class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
-    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        number_of: The number of hugepages to allocate.
-        force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
-    """
+    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
+    #: The number of hugepages to allocate.
     number_of: int
+    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
     force_first_numa: bool
 
 
 class PortConfig(BaseModel, frozen=True, extra="forbid"):
-    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        pci: The PCI address of the port.
-        os_driver_for_dpdk: The operating system driver name for use with DPDK.
-        os_driver: The operating system driver name when the operating system controls the port.
-        peer_node: The :class:`~framework.testbed_model.node.Node` of the port
-            connected to this port.
-        peer_pci: The PCI address of the port connected to this port.
-    """
+    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
-    pci: str = Field(
-        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
-    )
-    os_driver_for_dpdk: str = Field(
-        description="The driver that the kernel should bind this device to for DPDK to use it.",
-        examples=["vfio-pci", "mlx5_core"],
-    )
-    os_driver: str = Field(
-        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
-    )
-    peer_node: str = Field(description="The name of the peer node this port is connected to.")
-    peer_pci: str = Field(
-        description="The PCI address of the peer port this port is connected to.",
-        pattern=REGEX_FOR_PCI_ADDRESS,
-    )
+    #: The PCI address of the port.
+    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
+    #: The driver that the kernel should bind this device to for DPDK to use it.
+    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
+    #: The operating system driver name when the operating system controls the port.
+    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
+    #: The name of the peer node this port is connected to.
+    peer_node: str
+    #: The PCI address of the peer port connected to this port.
+    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
 
 
 class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
-    """A protocol required to define traffic generator types.
-
-    Attributes:
-        type: The traffic generator type, the child class is required to define to be distinguished
-            among others.
-    """
+    """A protocol required to define traffic generator types."""
 
+    #: The traffic generator type that the child class is required to define, so that it can be
+    #: distinguished among others.
     type: TrafficGeneratorType
 
 
@@ -176,13 +156,10 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
 #: A union type discriminating traffic generators by the `type` field.
 TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-
-#: A field representing logical core ranges.
+#: Comma-separated list of logical cores to use. An empty string means use all lcores.
 LogicalCores = Annotated[
     str,
     Field(
-        description="Comma-separated list of logical cores to use. "
-        "An empty string means use all lcores.",
         examples=["1,2,3,4,5,18-22", "10-15"],
         pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
     ),
@@ -190,61 +167,41 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
 
 
 class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
-    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        name: The name of the :class:`~framework.testbed_model.node.Node`.
-        hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
-            Can be an IP or a domain name.
-        user: The name of the user used to connect to
-            the :class:`~framework.testbed_model.node.Node`.
-        password: The password of the user. The use of passwords is heavily discouraged.
-            Please use keys instead.
-        arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
-        os: The operating system of the :class:`~framework.testbed_model.node.Node`.
-        lcores: A comma delimited list of logical cores to use when running DPDK.
-        use_first_core: If :data:`True`, the first logical core won't be used.
-        hugepages: An optional hugepage configuration.
-        ports: The ports that can be used in testing.
-    """
-
-    name: str = Field(description="A unique identifier for this node.")
-    hostname: str = Field(description="The hostname or IP address of the node.")
-    user: str = Field(description="The login user to use to connect to this node.")
-    password: str | None = Field(
-        default=None,
-        description="The login password to use to connect to this node. "
-        "SSH keys are STRONGLY preferred, use only as last resort.",
-    )
+    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
+
+    #: The name of the :class:`~framework.testbed_model.node.Node`.
+    name: str
+    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
+    hostname: str
+    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
+    user: str
+    #: The password of the user. The use of passwords is heavily discouraged, please use SSH keys.
+    password: str | None = None
+    #: The architecture of the :class:`~framework.testbed_model.node.Node`.
     arch: Architecture
+    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
     os: OS
+    #: A comma delimited list of logical cores to use when running DPDK.
     lcores: LogicalCores = "1"
-    use_first_core: bool = Field(
-        default=False, description="DPDK won't use the first physical core if set to False."
-    )
+    #: If :data:`True`, the first logical core won't be used.
+    use_first_core: bool = False
+    #: An optional hugepage configuration.
     hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    #: The ports that can be used in testing.
     ports: list[PortConfig] = Field(min_length=1)
 
 
 class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
-    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
 
-    Attributes:
-        memory_channels: The number of memory channels to use when running DPDK.
-    """
-
-    memory_channels: int = Field(
-        default=1, description="Number of memory channels to use when running DPDK."
-    )
+    #: The number of memory channels to use when running DPDK.
+    memory_channels: int = 1
 
 
 class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
-    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
-
-    Attributes:
-        traffic_generator: The configuration of the traffic generator present on the TG node.
-    """
+    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
 
+    #: The configuration of the traffic generator present on the TG node.
     traffic_generator: TrafficGeneratorConfigTypes
 
 
@@ -258,20 +215,18 @@ def resolve_path(path: Path) -> Path:
 
 
 class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
-    """DPDK location.
+    """DPDK location base class.
 
-    The path to the DPDK sources, build dir and type of location.
-
-    Attributes:
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
+    The path to the DPDK sources and type of location.
     """
 
+    #: Specifies whether to find DPDK on the SUT node or on the local host, which are respectively
+    #: represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
     remote: bool = False
 
 
 class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
-    """Local DPDK location parent class.
+    """Local DPDK location base class.
 
     This class is meant to represent any location that is present only locally.
     """
@@ -284,14 +239,12 @@ class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
 
     This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the local host passed as string.
     dpdk_tree: Path
 
-    #: Resolve the local DPDK tree path
+    #: Resolve the local DPDK tree path.
     resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
     @model_validator(mode="after")
@@ -307,14 +260,12 @@ class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
 
     This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the local host passed as string.
     tarball: Path
 
-    #: Resolve the local tarball path
+    #: Resolve the local tarball path.
     resolve_tarball_path = field_validator("tarball")(resolve_path)
 
     @model_validator(mode="after")
@@ -326,7 +277,7 @@ def validate_tarball_path(self) -> Self:
 
 
 class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
-    """Remote DPDK location parent class.
+    """Remote DPDK location base class.
 
     This class is meant to represent any location that is present only remotely.
     """
@@ -338,11 +289,9 @@ class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
     """Remote DPDK tree location.
 
     This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the remote node passed as string.
     dpdk_tree: PurePath
 
 
@@ -351,11 +300,9 @@ class RemoteDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
 
     This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the remote node passed as string.
     tarball: PurePath
 
 
@@ -372,23 +319,17 @@ class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The base configuration for different types of build.
 
     The configuration contain the location of the DPDK and configuration used for building it.
-
-    Attributes:
-        dpdk_location: The location of the DPDK tree.
     """
 
+    #: The location of the DPDK tree.
     dpdk_location: DPDKLocation
 
 
 class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
-    """DPDK precompiled build configuration.
-
-    Attributes:
-        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
-            is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will
-            be using `dpdk_build_config` from configuration to build the DPDK from source.
-    """
+    """DPDK precompiled build configuration."""
 
+    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
+    #: subdirectory of `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
     precompiled_build_dir: str = Field(min_length=1)
 
 
@@ -396,20 +337,18 @@ class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
     """DPDK build options configuration.
 
     The build options used for building DPDK.
-
-    Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when executing the build.
-            Useful for adding wrapper commands, such as ``ccache``.
     """
 
+    #: The target architecture to build for.
     arch: Architecture
+    #: The target OS to build for.
     os: OS
+    #: The target CPU to build for.
     cpu: CPUType
+    #: The compiler executable to use.
     compiler: Compiler
+    #: This string will be put in front of the compiler when executing the build. Useful for adding
+    #: wrapper commands, such as ``ccache``.
     compiler_wrapper: str = ""
 
     @cached_property
@@ -419,12 +358,9 @@ def name(self) -> str:
 
 
 class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
-    """DPDK uncompiled build configuration.
-
-    Attributes:
-        build_options: The build options to compile DPDK.
-    """
+    """DPDK uncompiled build configuration."""
 
+    #: The build options to compile DPDK with.
     build_options: DPDKBuildOptionsConfiguration
 
 
@@ -448,24 +384,13 @@ class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
             # or as model fields:
             - test_suite: hello_world
               test_cases: [hello_world_single_core] # without this field all test cases are run
-
-    Attributes:
-        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases_names: The names of test cases from this test suite to execute.
-            If empty, all test cases will be executed.
     """
 
-    test_suite_name: str = Field(
-        title="Test suite name",
-        description="The identifying module name of the test suite without the prefix.",
-        alias="test_suite",
-    )
-    test_cases_names: list[str] = Field(
-        default_factory=list,
-        title="Test cases by name",
-        description="The identifying name of the test cases of the test suite.",
-        alias="test_cases",
-    )
+    #: The name of the test suite module without the starting ``TestSuite_``.
+    test_suite_name: str = Field(alias="test_suite")
+    #: The names of test cases from this test suite to execute. If empty, all test cases will be
+    #: executed.
+    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
 
     @cached_property
     def test_suite_spec(self) -> "TestSuiteSpec":
@@ -507,14 +432,11 @@ def validate_names(self) -> Self:
 
 
 class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
-    """The SUT node configuration of a test run.
-
-    Attributes:
-        node_name: The SUT node to use in this test run.
-        vdevs: The names of virtual devices to test.
-    """
+    """The SUT node configuration of a test run."""
 
+    #: The SUT node to use in this test run.
     node_name: str
+    #: The names of virtual devices to test.
     vdevs: list[str] = Field(default_factory=list)
 
 
@@ -523,25 +445,23 @@ class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
 
     The configuration contains testbed information, what tests to execute
     and with what DPDK build.
-
-    Attributes:
-        dpdk_config: The DPDK configuration used to test.
-        perf: Whether to run performance tests.
-        func: Whether to run functional tests.
-        skip_smoke_tests: Whether to skip smoke tests.
-        test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node configuration to use in this test run.
-        traffic_generator_node: The TG node name to use in this test run.
-        random_seed: The seed to use for pseudo-random generation.
     """
 
+    #: The DPDK configuration used to test.
     dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
-    perf: bool = Field(description="Enable performance testing.")
-    func: bool = Field(description="Enable functional testing.")
+    #: Whether to run performance tests.
+    perf: bool
+    #: Whether to run functional tests.
+    func: bool
+    #: Whether to skip smoke tests.
     skip_smoke_tests: bool = False
+    #: The names of test suites and/or test cases to execute.
     test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    #: The SUT node configuration to use in this test run.
     system_under_test_node: TestRunSUTNodeConfiguration
+    #: The TG node name to use in this test run.
     traffic_generator_node: str
+    #: The seed to use for pseudo-random generation.
     random_seed: int | None = None
 
 
@@ -557,14 +477,11 @@ class TestRunWithNodesConfiguration(NamedTuple):
 
 
 class Configuration(BaseModel, extra="forbid"):
-    """DTS testbed and test configuration.
-
-    Attributes:
-        test_runs: Test run configurations.
-        nodes: Node configurations.
-    """
+    """DTS testbed and test configuration."""
 
+    #: Test run configurations.
     test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    #: Node configurations.
     nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
     @cached_property
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v4 8/8] dts: use TestSuiteSpec class imports
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (6 preceding siblings ...)
  2024-10-28 17:49   ` [PATCH v4 7/8] dts: improve configuration API docs Luca Vizzarro
@ 2024-10-28 17:49   ` Luca Vizzarro
  2024-11-04 17:50     ` Nicholas Pratte
  7 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-28 17:49 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are also automatically imported. This causes double imports, as
the runner loads the test suites again. This change makes the runner
load the classes already imported by TestSuiteSpec instead of
importing them a second time.
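
For context, a minimal sketch of the import-once discovery pattern this
relies on (simplified; the real logic lives in TestSuiteSpec, and the
function and package names here are only examples):

    from importlib import import_module
    from pkgutil import iter_modules
    from types import ModuleType

    def discover_modules(package_name: str) -> list[ModuleType]:
        package = import_module(package_name)
        modules = []
        for module_info in iter_modules(package.__path__):
            # Each test suite module is imported exactly once here; the
            # runner then reuses these objects instead of calling
            # import_module on the same names again.
            modules.append(import_module(f"{package_name}.{module_info.name}"))
        return modules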

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/runner.py | 84 ++++-------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases
 
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
     def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-09-30 17:56   ` Nicholas Pratte
@ 2024-10-29 12:41     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:41 UTC (permalink / raw)
  To: Nicholas Pratte
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

On 30/09/2024 18:56, Nicholas Pratte wrote:
>> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
>> +dataclass model. The Pydantic model is also available as
> 
> Out of curiosity, what is the reason for maintaining use of
> dataclasses here as opposed to creating BaseModel subclasses for the
> Pydantic library? I suppose both implementations would lead to the
> same result, but is it mostly for the sake of familiarity and
> consistency that we're still using dataclasses here?

The original idea was, as you said, familiarity and consistency, but it 
actually brings extra headaches and is unnecessary. I've used BaseModel 
for everything in the newer versions.
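
For reference, the two styles side by side (a minimal sketch, not the 
framework's actual models):

    from pydantic import BaseModel, ConfigDict
    from pydantic.dataclasses import dataclass

    # Pydantic dataclass: familiar decorator, config passed separately.
    @dataclass(frozen=True, config=ConfigDict(extra="forbid"))
    class NodeAsDataclass:
        name: str

    # BaseModel: config given as class keyword arguments.
    class NodeAsModel(BaseModel, frozen=True, extra="forbid"):
        name: str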

>>   @unique
>> -class TrafficGeneratorType(StrEnum):
>> +class TrafficGeneratorType(str, Enum):
>>       """The supported traffic generators."""
>>
>>       #:
>> -    SCAPY = auto()
>> +    SCAPY = "SCAPY"
> 
> Going off of Juraj's comments, would you be able to provide a deeper
> explanation as to how this new parameterization of str and enum works
> with respect to the Pydantic field discriminators?

This is something I am unfortunately not able to explain fully. I am 
not 100% sure how Pydantic works in this case, but the original 
implementation was incompatible with the discriminator. Retaining the 
original setup made Pydantic use values of:

  "<TrafficGeneratorType.SCAPY: SCAPY>"

instead of plain "SCAPY". So it's an incompatibility issue.
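
For reference, a minimal sketch of the resulting pattern (mirroring the 
patch; the assertions only illustrate why the discriminator now sees a 
plain string):

    from enum import Enum, unique

    @unique
    class TrafficGeneratorType(str, Enum):
        """The supported traffic generators."""

        SCAPY = "SCAPY"

    # Because the enum also subclasses str, its members behave as plain
    # strings, which is what the Pydantic discriminator matches against.
    assert TrafficGeneratorType.SCAPY == "SCAPY"
    assert TrafficGeneratorType.SCAPY.value == "SCAPY"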

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-09-30 21:45   ` Dean Marx
@ 2024-10-29 12:51     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:51 UTC (permalink / raw)
  To: Dean Marx; +Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

On 30/09/2024 22:45, Dean Marx wrote:
>     -@dataclass(slots=True, frozen=True)
>     +@dataclass(slots=True, frozen=True, kw_only=True,
>     config=ConfigDict(extra="forbid"))
> 
> 
> Up to you but I think it might be worth specifying what some of these 
> extra pydantic args are for if we're going to keep the name of the 
> decorator as "dataclass." For example, this ConfigDict "forbid" argument 
> seems to be commonly used, same with the "before/after" modes with the 
> model_validator. Maybe a brief description somewhere in the docstrings, 
> just so others can see how it differs from the previous implementation 
> even without experience using pydantic.

Your concern is understandable, but I don't want to risk repeating what 
the Pydantic docs already say. As this is a feature of Pydantic itself, 
it should be looked up in its documentation. On the other hand, saying 
why we are doing it would make perfect sense, but it's also quite 
obvious at the same time: `extra="forbid"` just means that any extra 
fields provided in the configuration are rejected during deserialization.
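
To make the effect concrete, a short sketch (the model and field here 
are trimmed examples, not the framework's full definitions):

    from pydantic import BaseModel, ValidationError

    class PortConfig(BaseModel, frozen=True, extra="forbid"):
        pci: str

    try:
        # "pcie" is not a declared field, so validation fails instead of
        # silently ignoring the typo in the configuration.
        PortConfig(pci="0000:00:08.0", pcie="typo")
    except ValidationError as err:
        print(err)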

>     +    @model_validator(mode="before")
>           @classmethod
>     -    def from_dict(
>     -        cls,
>     -        entry: str | TestSuiteConfigDict,
>     -    ) -> Self:
>     -        """Create an instance from two different types.
>     +    def convert_from_string(cls, data: Any) -> Any:
>     +        """Convert the string representation into a valid mapping."""
>     +        if isinstance(data, str):
>     +            [test_suite, *test_cases] = data.split()
>     +            return dict(test_suite=test_suite, test_cases=test_cases)
>     +        return data
> 
> 
> Again this is completely your call, but might be worth explaining in the 
> docstrings why this "before" method is used here while the other 
> validators are running with "after."

Similarly, this comes with an understanding of the 
deserialization/validation/creation phases. Pydantic actually has its 
own meaning of "validators", which could be confusing at first glance 
and may be worth reading into. Here we are just re-using the same 
terminology for consistency. The validators themselves explain what they 
do. The phases need to be understood from Pydantic's documentation in 
order to understand... Pydantic code.

In the code you quoted, the model docs say that instead of a mapping of 
fields, such as:

   test_suite: hello_world
   test_cases:
   - hello_world_single_core

a string could be provided:

   hello_world hello_world_single_core

This validator is set in before mode because we can't instantiate a 
Pydantic model with a string, and we need to parse the string to provide 
the valid mapping for Pydantic. The before mode runs before the model is 
validated by Pydantic and receives the raw input; the after mode runs 
afterwards and is given the validated instance instead.
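
Putting that together, a self-contained sketch of the before-mode 
conversion (mirroring the quoted code, with a trimmed field set):

    from typing import Any
    from pydantic import BaseModel, Field, model_validator

    class TestSuiteConfig(BaseModel, extra="forbid"):
        test_suite_name: str = Field(alias="test_suite")
        test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")

        @model_validator(mode="before")
        @classmethod
        def convert_from_string(cls, data: Any) -> Any:
            # Runs on the raw input, before field validation: a bare
            # string is split into the mapping Pydantic expects.
            if isinstance(data, str):
                [test_suite, *test_cases] = data.split()
                return dict(test_suite=test_suite, test_cases=test_cases)
            return data

    config = TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    assert config.test_suite_name == "hello_world"
    assert config.test_cases_names == ["hello_world_single_core"]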

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-09-17 11:39   ` Juraj Linkeš
@ 2024-10-29 12:52     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:52 UTC (permalink / raw)
  To: Juraj Linkeš, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek

On 17/09/2024 12:39, Juraj Linkeš wrote:
>> @@ -229,139 +221,34 @@ def _get_test_suites_with_cases(
> 
>> +            filtered_test_cases: list[TestCase] = [
>> +                test_case
>> +                for test_case in test_suite_spec.test_cases
>> +                if not test_suite_config.test_cases_names
>> +                or test_case.name in test_suite_config.test_cases_names
>> +            ]
> 
> Ah, looks like TestSuiteSpec doesn't contain the subset we want to test. 
> Could we adapt it this way? I think we don't really care about test 
> cases we don't want to test.

This is a bit out of scope, as TestSuiteSpec only describes the 
specification of the test suites as they are defined.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-10-01 17:12   ` Dean Marx
@ 2024-10-29 12:54     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:54 UTC (permalink / raw)
  To: Dean Marx; +Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

On 01/10/2024 18:12, Dean Marx wrote:
> 
>     +            filtered_test_cases: list[TestCase] = [
>     +                test_case
>     +                for test_case in test_suite_spec.test_cases
>     +                if not test_suite_config.test_cases_names
>     +                or test_case.name in test_suite_config.test_cases_names
>     +            ]
> 
> 
> Just wondering, what's the plan with the filtered test cases? I'm 
> assuming they're stored here so we can report which cases were skipped 
> after runtime?

I believe this behaviour changed in the latest version I've sent. But 
the idea is that we now have test cases as objects instead of strings, 
and the purpose of this code is to retrieve the objects matching the 
user-provided strings.
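
As a plain-data sketch of that selection logic (the names here are 
illustrative only, not the framework's):

    from dataclasses import dataclass

    @dataclass
    class TestCase:
        name: str

    available = [TestCase("hello_world_single_core"), TestCase("hello_world_all_cores")]
    selected = ["hello_world_single_core"]

    # An empty selection means "run everything"; otherwise keep matches.
    filtered = [tc for tc in available if not selected or tc.name in selected]
    assert [tc.name for tc in filtered] == ["hello_world_single_core"]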

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-10-01 20:45   ` Nicholas Pratte
@ 2024-10-29 12:56     ` Luca Vizzarro
  2024-11-04 17:49       ` Nicholas Pratte
  0 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:56 UTC (permalink / raw)
  To: Nicholas Pratte
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

On 01/10/2024 21:45, Nicholas Pratte wrote:
> The code you have here makes sense, and I like the implementation as
> it removes a lot of fluff in DTSRunner. I know Juraj mentioned in an
> earlier patch in this series that this functionality intersects with
> the capabilities series, but I'm missing a lot of context to
> understand that fully. Maybe you could provide some insight? I'll make
> sure to analyse this deeper in my own time as well. Beyond that:

Most of the intersection comes from the fact that this series adds 
auto-discovery of test suites and test cases, and therefore treats the 
test cases as objects with labels for processing and filtering.

In his capability patches Juraj also needed to turn test cases into 
objects to add runtime metadata to them, such as required capabilities.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 1/5] dts: add TestSuiteSpec class and discovery
  2024-09-16 13:00   ` Juraj Linkeš
@ 2024-10-29 12:57     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 12:57 UTC (permalink / raw)
  To: Juraj Linkeš, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek

Thank you Juraj for the comments.

Most if not all suggestions have been applied in the new versions.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 3/5] dts: use Pydantic in the configuration
  2024-09-17 11:13   ` Juraj Linkeš
@ 2024-10-29 13:00     ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-10-29 13:00 UTC (permalink / raw)
  To: Juraj Linkeš, dev; +Cc: Honnappa Nagarahalli, Paul Szczepanek

Likewise, thank you Juraj for the comments.

Most if not all suggestions have been applied in the new versions.

On 17/09/2024 12:13, Juraj Linkeš wrote:
>> - the config schema is no longer used for validation but kept as an
>>    alternative format for the developer
> 
> If it's not used, we should remove it right away (in this patch). I see 
> that it's updated in v5, but we can just add it back.

You are right, and this is now removed in the new versions.

>> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
>> @@ -231,10 +234,10 @@ def _get_test_suites_with_cases(
>>           test_suites_with_cases = []
>>           for test_suite_config in test_suite_configs:
>> -            test_suite_class = 
>> self._get_test_suite_class(test_suite_config.test_suite)
>> +            test_suite_class = 
>> self._get_test_suite_class(test_suite_config.test_suite_name)
> 
> We've already done all the validation and importing at this point and we 
> should be able to use test_suite_config.test_suite_spec, right? The same 
> is true for TestSuiteWithCases, which holds the same information.

This is correct.

> Looks like you removed _get_test_suite_class in a subsequent patch, but 
> we should think about getting rid of TestSuiteWithCases, as it was 
> conceived to do what TestSuiteSpec is doing.

I believe the two have different purposes: one just describes the 
specification, and one extracts a subset for runtime. It is a good idea 
to unify them anyway, so it may be something we could do in the near future.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 1/8] dts: add pydantic dependency
  2024-10-28 17:49   ` [PATCH v4 1/8] dts: add pydantic dependency Luca Vizzarro
@ 2024-10-31 18:42     ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 18:42 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> As part of configuration validation and deserialization improvements,
> this adds pydantic as a project dependency. Pydantic is a library that
> caters to all of the aforementioned needs, while improving the process
> and code.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
>  dts/pyproject.toml |   1 +
>  2 files changed, 170 insertions(+), 2 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index cf5f6569c6..56c50ad52c 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,4 +1,4 @@
> -# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
> +# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
>
>  [[package]]
>  name = "aenum"
> @@ -23,6 +23,17 @@ files = [
>      {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
>  ]
>
> +[[package]]
> +name = "annotated-types"
> +version = "0.7.0"
> +description = "Reusable constraint types to use with typing.Annotated"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
> +    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
> +]
> +
>  [[package]]
>  name = "attrs"
>  version = "23.1.0"
> @@ -567,6 +578,16 @@ files = [
>      {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
>      {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
>      {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
> +    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
>      {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
>      {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
>      {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
> @@ -762,6 +783,130 @@ files = [
>      {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
>  ]
>
> +[[package]]
> +name = "pydantic"
> +version = "2.9.2"
> +description = "Data validation using Python type hints"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
> +    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
> +]
> +
> +[package.dependencies]
> +annotated-types = ">=0.6.0"
> +pydantic-core = "2.23.4"
> +typing-extensions = [
> +    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
> +    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
> +]
> +
> +[package.extras]
> +email = ["email-validator (>=2.0.0)"]
> +timezone = ["tzdata"]
> +
> +[[package]]
> +name = "pydantic-core"
> +version = "2.23.4"
> +description = "Core functionality for Pydantic validation and serialization"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
> +    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
> +    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
> +    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
> +    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
> +    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
> +    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
> +    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
> +    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
> +    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
> +    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
> +    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
> +    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
> +    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
> +    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
> +    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
> +    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
> +    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
> +    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
> +    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
> +    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
> +    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
> +]
> +
> +[package.dependencies]
> +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
> +
>  [[package]]
>  name = "pydocstyle"
>  version = "6.1.1"
> @@ -880,6 +1025,7 @@ files = [
>      {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
>      {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
>      {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
> +    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
>      {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
>      {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
>      {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
> @@ -887,8 +1033,16 @@ files = [
>      {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
>      {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
>      {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
> +    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
>      {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
>      {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
> +    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
>      {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
>      {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
>      {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
> @@ -905,6 +1059,7 @@ files = [
>      {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
>      {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
>      {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
> +    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
>      {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
>      {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
>      {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
> @@ -912,6 +1067,7 @@ files = [
>      {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
>      {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
>      {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
> +    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
>      {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
>      {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
>      {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
> @@ -1327,6 +1483,17 @@ files = [
>      {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
>  ]
>
> +[[package]]
> +name = "typing-extensions"
> +version = "4.12.2"
> +description = "Backported and Experimental Type Hints for Python 3.8+"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
> +    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
> +]
> +
>  [[package]]
>  name = "urllib3"
>  version = "2.0.7"
> @@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
>  [metadata]
>  lock-version = "2.0"
>  python-versions = "^3.10"
> -content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
> +content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 506380ac2f..6c2d1ca8a4 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -28,6 +28,7 @@ scapy = "^2.5.0"
>  pydocstyle = "6.1.1"
>  typing-extensions = "^4.11.0"
>  aenum = "^3.1.15"
> +pydantic = "^2.9.2"
>
>  [tool.poetry.group.dev.dependencies]
>  mypy = "^1.10.0"
> --
> 2.43.0
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery
  2024-10-28 17:49   ` [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-10-31 19:32     ` Nicholas Pratte
  2024-10-31 20:21     ` Nicholas Pratte
  1 sibling, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 19:32 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Currently there is a lack of a definition which identifies all the test
> suites available to test. This change intends to simplify the process to
> discover all the test suites and identify them.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/framework/runner.py                   |   2 +-
>  dts/framework/test_suite.py               | 189 +++++++++++++++++++---
>  dts/framework/testbed_model/capability.py |  12 +-
>  3 files changed, 177 insertions(+), 26 deletions(-)
>
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index 8bbe698eaf..195622c653 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
>          for test_suite_config in test_suite_configs:
>              test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
>              test_cases: list[type[TestCase]] = []
> -            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
> +            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
>                  test_suite_config.test_cases
>              )
>              if func:
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index cbe3b30ffc..936eb2cede 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -1,6 +1,7 @@
>  # SPDX-License-Identifier: BSD-3-Clause
>  # Copyright(c) 2010-2014 Intel Corporation
>  # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>
>  """Features common to all test suites.
>
> @@ -16,13 +17,20 @@
>  import inspect
>  from collections import Counter
>  from collections.abc import Callable, Sequence
> +from dataclasses import dataclass
>  from enum import Enum, auto
> +from functools import cached_property
> +from importlib import import_module
>  from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> +from pkgutil import iter_modules
> +from types import ModuleType
>  from typing import ClassVar, Protocol, TypeVar, Union, cast
>
> +from pydantic.alias_generators import to_pascal
>  from scapy.layers.inet import IP  # type: ignore[import-untyped]
>  from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
>  from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
> +from typing_extensions import Self
>
>  from framework.testbed_model.capability import TestProtocol
>  from framework.testbed_model.port import Port
> @@ -33,7 +41,7 @@
>      PacketFilteringConfig,
>  )
>
> -from .exception import ConfigurationError, TestCaseVerifyError
> +from .exception import ConfigurationError, InternalError, TestCaseVerifyError
>  from .logger import DTSLogger, get_dts_logger
>  from .utils import get_packet_summaries
>
> @@ -112,10 +120,24 @@ def __init__(
>          self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
>
>      @classmethod
> -    def get_test_cases(
> +    def get_test_cases(cls) -> list[type["TestCase"]]:
> +        """A list of all the available test cases."""
> +
> +        def is_test_case(function: Callable) -> bool:
> +            if inspect.isfunction(function):
> +                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
> +                # But function.test_type exists.
> +                if hasattr(function, "test_type"):
> +                    return isinstance(function.test_type, TestCaseType)
> +            return False
> +
> +        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
> +
> +    @classmethod
> +    def filter_test_cases(
>          cls, test_case_sublist: Sequence[str] | None = None
>      ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
> -        """Filter `test_case_subset` from this class.
> +        """Filter `test_case_sublist` from this class.
>
>          Test cases are regular (or bound) methods decorated with :func:`func_test`
>          or :func:`perf_test`.
> @@ -129,17 +151,8 @@ def get_test_cases(
>              as methods are bound to instances and this method only has access to the class.
>
>          Raises:
> -            ConfigurationError: If a test case from `test_case_subset` is not found.
> +            ConfigurationError: If a test case from `test_case_sublist` is not found.
>          """
> -
> -        def is_test_case(function: Callable) -> bool:
> -            if inspect.isfunction(function):
> -                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
> -                # But function.test_type exists.
> -                if hasattr(function, "test_type"):
> -                    return isinstance(function.test_type, TestCaseType)
> -            return False
> -
>          if test_case_sublist is None:
>              test_case_sublist = []
>
> @@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool:
>          func_test_cases = set()
>          perf_test_cases = set()
>
> -        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
> -            if test_case_name in test_case_sublist_copy:
> +        for test_case in cls.get_test_cases():
> +            if test_case.name in test_case_sublist_copy:
>                  # if test_case_sublist_copy is non-empty, remove the found test case
>                  # so that we can look at the remainder at the end
> -                test_case_sublist_copy.remove(test_case_name)
> +                test_case_sublist_copy.remove(test_case.name)
>              elif test_case_sublist:
>                  # the original list not being empty means we're filtering test cases
> -                # since we didn't remove test_case_name in the previous branch,
> +                # since we didn't remove test_case.name in the previous branch,
>                  # it doesn't match the filter and we don't want to remove it
>                  continue
>
> -            match test_case_function.test_type:
> +            match test_case.test_type:
>                  case TestCaseType.PERFORMANCE:
> -                    perf_test_cases.add(test_case_function)
> +                    perf_test_cases.add(test_case)
>                  case TestCaseType.FUNCTIONAL:
> -                    func_test_cases.add(test_case_function)
> +                    func_test_cases.add(test_case)
>
>          if test_case_sublist_copy:
>              raise ConfigurationError(
> @@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
>      test case function to :class:`TestCase` and sets common variables.
>      """
>
> +    #:
> +    name: ClassVar[str]
>      #:
>      test_type: ClassVar[TestCaseType]
>      #: necessary for mypy so that it can treat this class as the function it's shadowing
> @@ -560,6 +575,7 @@ def make_decorator(
>
>          def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
>              test_case = cast(type[TestCase], func)
> +            test_case.name = func.__name__
>              test_case.skip = cls.skip
>              test_case.skip_reason = cls.skip_reason
>              test_case.required_capabilities = set()
> @@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
>  func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
>  #: The decorator for performance test cases.
>  perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
> +
> +
> +@dataclass
> +class TestSuiteSpec:
> +    """A class defining the specification of a test suite.
> +
> +    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
> +    provided to automatically discover all the available test suites.
> +
> +    Attributes:
> +        module_name: The name of the test suite's module.
> +    """
> +
> +    #:
> +    TEST_SUITES_PACKAGE_NAME = "tests"
> +    #:
> +    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
> +    #:
> +    TEST_SUITE_CLASS_PREFIX = "Test"
> +    #:
> +    TEST_CASE_METHOD_PREFIX = "test_"
> +    #:
> +    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
> +    #:
> +    PERF_TEST_CASE_REGEX = r"test_perf_"
> +
> +    module_name: str
> +
> +    @cached_property
> +    def name(self) -> str:
> +        """The name of the test suite's module."""
> +        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
> +
> +    @cached_property
> +    def module(self) -> ModuleType:
> +        """A reference to the test suite's module."""
> +        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
> +
> +    @cached_property
> +    def class_name(self) -> str:
> +        """The name of the test suite's class."""
> +        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
> +
> +    @cached_property
> +    def class_obj(self) -> type[TestSuite]:
> +        """A reference to the test suite's class."""
> +
> +        def is_test_suite(obj) -> bool:
> +            """Check whether `obj` is a :class:`TestSuite`.
> +
> +            The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
> +
> +            Args:
> +                obj: The object to be checked.
> +
> +            Returns:
> +                :data:`True` if `obj` is a subclass of `TestSuite`.
> +            """
> +            try:
> +                if issubclass(obj, TestSuite) and obj is not TestSuite:
> +                    return True
> +            except TypeError:
> +                return False
> +            return False
> +
> +        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
> +            if class_name == self.class_name:
> +                return class_obj
> +
> +        raise InternalError(
> +            f"Expected class {self.class_name} not found in module {self.module_name}."
> +        )
> +
> +    @classmethod
> +    def discover_all(
> +        cls, package_name: str | None = None, module_prefix: str | None = None
> +    ) -> list[Self]:
> +        """Discover all the test suites.
> +
> +        The test suites are discovered in the provided `package_name`. The full module name,
> +        expected under that package, is prefixed with `module_prefix`.
> +        The module name is a standard filename with words separated with underscores.
> +        For each module found, search for a :class:`TestSuite` class which starts
> +        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
> +        PascalCase.
> +
> +        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
> +
> +            OS -> Os
> +            TCP -> Tcp
> +
> +        Args:
> +            package_name: The name of the package where to find the test suites. If :data:`None`,
> +                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
> +            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
> +                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
> +
> +        Returns:
> +            A list containing all the discovered test suites.
> +        """
> +        if package_name is None:
> +            package_name = cls.TEST_SUITES_PACKAGE_NAME
> +        if module_prefix is None:
> +            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
> +
> +        test_suites = []
> +
> +        test_suites_pkg = import_module(package_name)
> +        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
> +            if not module_name.startswith(module_prefix) or is_pkg:
> +                continue
> +
> +            test_suite = cls(module_name)
> +            try:
> +                if test_suite.class_obj:
> +                    test_suites.append(test_suite)
> +            except InternalError as err:
> +                get_dts_logger().warning(err)
> +
> +        return test_suites
> +
> +
> +AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
> +"""Constant to store all the available, discovered and imported test suites.
> +
> +The test suites should be gathered from this list to avoid importing more than once.
> +"""
> +
> +
> +def find_by_name(name: str) -> TestSuiteSpec | None:
> +    """Find a requested test suite by name from the available ones."""
> +    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
> +    return next(test_suites, None)
> diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
> index 2207957a7a..0d5f0e0b32 100644
> --- a/dts/framework/testbed_model/capability.py
> +++ b/dts/framework/testbed_model/capability.py
> @@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
>
>  import inspect
>  from abc import ABC, abstractmethod
> -from collections.abc import MutableSet, Sequence
> +from collections.abc import MutableSet
>  from dataclasses import dataclass
> -from typing import Callable, ClassVar, Protocol
> +from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
>
>  from typing_extensions import Self
>
> @@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
>  from .sut_node import SutNode
>  from .topology import Topology, TopologyType
>
> +if TYPE_CHECKING:
> +    from framework.test_suite import TestCase
> +
>
>  class Capability(ABC):
>      """The base class for various capabilities.
> @@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
>          if inspect.isclass(test_case_or_suite):
>              if self.topology_type is not TopologyType.default:
>                  self.add_to_required(test_case_or_suite)
> -                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
> -                for test_case in func_test_cases | perf_test_cases:
> +                for test_case in test_case_or_suite.get_test_cases():
>                      if test_case.topology_type.topology_type is TopologyType.default:
>                          # test case topology has not been set, use the one set by the test suite
>                          self.add_to_required(test_case)
> @@ -446,7 +448,7 @@ class TestProtocol(Protocol):
>      required_capabilities: ClassVar[set[Capability]] = set()
>
>      @classmethod
> -    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
> +    def get_test_cases(cls) -> list[type["TestCase"]]:
>          """Get test cases. Should be implemented by subclasses containing test cases.
>
>          Raises:
> --
> 2.43.0
>
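
As a usage sketch of the discovery mechanism added above (the
"hello_world" suite name is hypothetical, chosen only to illustrate the
module-to-class naming convention):

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # A module tests/TestSuite_hello_world.py is expected to define a class
    # named TestHelloWorld (TEST_SUITE_CLASS_PREFIX + PascalCase suite name).
    spec = find_by_name("hello_world")
    if spec is not None:
        suite_class = spec.class_obj  # cached after the first lookup
        print(spec.module_name, "->", suite_class.__name__)

    # All discovered suites are imported once and cached in this list.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.name)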

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 3/8] dts: refactor build and node info classes
  2024-10-28 17:49   ` [PATCH v4 3/8] dts: refactor build and node info classes Luca Vizzarro
@ 2024-10-31 20:16     ` Nicholas Pratte
  2024-11-06 18:02       ` Luca Vizzarro
  0 siblings, 1 reply; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 20:16 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

It took me a second to appreciate what the goal of separating this is,
but it makes complete sense to me now. This was an oversight on my end
as well when I was working on the config changes in one of the
patches I was assigned. Interestingly enough, I ran into a similar
problem with circular dependencies a long time ago when I was
attempting to do some 'arch' discovery changes, which I still intend
to implement. I think putting the node config in the os_session
component makes sense from a readability standpoint as well.

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> The DPDKBuildInfo and NodeInfo classes, representing information
> gathered at runtime, were erroneously placed in the configuration
> package. This moves them to more appropriate modules.
>
> NodeInfo, specifically, ia moved to os_session instead of node mostly

Small typo here, change 'ia' to 'is'.

> as a consequence of circular dependencies. And given os_session is the
> top-most module to reference it, it appears to be the most suitable
> place outside of node.

As I said, this makes sense to me, but I wonder if it might make sense
to change 'NodeInfo' to 'OSSessionInfo' or something like that. I'd
imagine that if any attributes were to be tacked on in the future they
would probably be OS-related, but maybe there would be system
information, and in this case "OSSessionInfo" might be a good middle
ground. There are existing changes that I've done where arch is
discovered during runtime, and this could probably be placed in this
'NodeInfo' class as well when I get around to revising it. My only
concern is whether or not having "NodeConfiguration" and "NodeInfo"
classes floating around might make the framework more confusing to
read.
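
For reference, the cycle-breaking pattern this series uses in
capability.py confines the import to type-checking time. A minimal
sketch, with illustrative names only:

    from typing import TYPE_CHECKING

    if TYPE_CHECKING:
        # Evaluated only by static type checkers; at runtime this import
        # never executes, so the circular runtime import is avoided.
        from framework.test_suite import TestCase

    def case_names(cases: list["TestCase"]) -> list[str]:
        # The quoted annotation defers resolution, so framework.test_suite
        # is not needed when this module is first imported.
        return [case.name for case in cases]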

<snip>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 4/8] dts: use pydantic in the configuration
  2024-10-28 17:49   ` [PATCH v4 4/8] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-10-31 20:20     ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 20:20 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> This change brings in pydantic in place of warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> more resilient and simpler code. As a consequence of this change:
>
> - most validation is now built-in
> - further validation is added to verify:
>   - cross referencing of node names and ports
>   - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used and therefore dropped
> - the TrafficGeneratorType enum has been changed from inheriting
>   StrEnum to the native str and Enum. This change was necessary to
>   enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>   match the structure of the configuration files
> - the test suite argument catches the ValidationError that
>   TestSuiteConfig can now raise
> - the DPDK location has been wrapped under another configuration
>   mapping `dpdk_location`
> - the DPDK locations are now structured and enforced by classes,
>   further simplifying the validation and handling thanks to
>   pattern matching
>
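
To make the discriminated-union point above concrete, a minimal sketch of
the mechanism (the TREX variant is hypothetical, included only to show the
dispatch; this series defines SCAPY only):

    from enum import Enum
    from typing import Annotated, Literal, Union

    from pydantic import BaseModel, Field, TypeAdapter

    class TGType(str, Enum):
        SCAPY = "SCAPY"
        TREX = "TREX"  # hypothetical, for illustration

    class ScapyTG(BaseModel):
        type: Literal[TGType.SCAPY]

    class TrexTG(BaseModel):
        type: Literal[TGType.TREX]

    # Pydantic picks the variant by inspecting the `type` field.
    TGConfig = Annotated[Union[ScapyTG, TrexTG], Field(discriminator="type")]

    tg = TypeAdapter(TGConfig).validate_python({"type": "SCAPY"})
    assert isinstance(tg, ScapyTG)
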
> Bugzilla ID: 1508
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  doc/api/dts/conf_yaml_schema.json             |   1 -
>  doc/api/dts/framework.config.rst              |   6 -
>  doc/api/dts/framework.config.types.rst        |   8 -
>  dts/conf.yaml                                 |  11 +-
>  dts/framework/config/__init__.py              | 822 +++++++++---------
>  dts/framework/config/conf_yaml_schema.json    | 459 ----------
>  dts/framework/config/types.py                 | 149 ----
>  dts/framework/runner.py                       |  57 +-
>  dts/framework/settings.py                     | 124 +--
>  dts/framework/testbed_model/node.py           |  15 +-
>  dts/framework/testbed_model/os_session.py     |   4 +-
>  dts/framework/testbed_model/port.py           |   4 +-
>  dts/framework/testbed_model/posix_session.py  |  10 +-
>  dts/framework/testbed_model/sut_node.py       | 182 ++--
>  dts/framework/testbed_model/topology.py       |  11 +-
>  .../traffic_generator/__init__.py             |   4 +-
>  .../traffic_generator/traffic_generator.py    |   2 +-
>  dts/framework/utils.py                        |   2 +-
>  dts/tests/TestSuite_smoke_tests.py            |   2 +-
>  19 files changed, 653 insertions(+), 1220 deletions(-)
>  delete mode 120000 doc/api/dts/conf_yaml_schema.json
>  delete mode 100644 doc/api/dts/framework.config.types.rst
>  delete mode 100644 dts/framework/config/conf_yaml_schema.json
>  delete mode 100644 dts/framework/config/types.py
>
> diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
> deleted file mode 120000
> index 5978642d76..0000000000
> --- a/doc/api/dts/conf_yaml_schema.json
> +++ /dev/null
> @@ -1 +0,0 @@
> -../../../dts/framework/config/conf_yaml_schema.json
> \ No newline at end of file
> diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
> index 261997aefa..cc266276c1 100644
> --- a/doc/api/dts/framework.config.rst
> +++ b/doc/api/dts/framework.config.rst
> @@ -6,9 +6,3 @@ config - Configuration Package
>  .. automodule:: framework.config
>     :members:
>     :show-inheritance:
> -
> -.. toctree::
> -   :hidden:
> -   :maxdepth: 1
> -
> -   framework.config.types
> diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
> deleted file mode 100644
> index a50a0c874a..0000000000
> --- a/doc/api/dts/framework.config.types.rst
> +++ /dev/null
> @@ -1,8 +0,0 @@
> -.. SPDX-License-Identifier: BSD-3-Clause
> -
> -config.types - Configuration Types
> -==================================
> -
> -.. automodule:: framework.config.types
> -   :members:
> -   :show-inheritance:
> diff --git a/dts/conf.yaml b/dts/conf.yaml
> index 8a65a481d6..2496262854 100644
> --- a/dts/conf.yaml
> +++ b/dts/conf.yaml
> @@ -5,11 +5,12 @@
>  test_runs:
>    # define one test run environment
>    - dpdk_build:
> -      # dpdk_tree: Commented out because `tarball` is defined.
> -      tarball: dpdk-tarball.tar.xz
> -      # Either `dpdk_tree` or `tarball` can be defined, but not both.
> -      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> -                    # is located on the SUT node, instead of the execution host.
> +      dpdk_location:
> +        # dpdk_tree: Commented out because `tarball` is defined.
> +        tarball: dpdk-tarball.tar.xz
> +        # Either `dpdk_tree` or `tarball` can be defined, but not both.
> +        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> +                      # is located on the SUT node, instead of the execution host.
>
>        # precompiled_build_dir: Commented out because `build_options` is defined.
>        build_options:
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index 7403ccbf14..c86bfaaabf 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -2,17 +2,18 @@
>  # Copyright(c) 2010-2021 Intel Corporation
>  # Copyright(c) 2022-2023 University of New Hampshire
>  # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
>
>  """Testbed configuration and test suite specification.
>
>  This package offers classes that hold real-time information about the testbed, hold test run
>  configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
> -the YAML test run configuration file
> -and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> +model.
>
>  The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
> -this package. The allowed keys and types inside this dictionary are defined in
> -the :doc:`types <framework.config.types>` module.
> +this package. The allowed keys and types inside this dictionary map directly to the
> +:class:`Configuration` model, its fields and sub-models.
>
>  The test run configuration has two main sections:
>
> @@ -24,39 +25,28 @@
>
>  The real-time information about testbed is supposed to be gathered at runtime.
>
> -The classes defined in this package make heavy use of :mod:`dataclasses`.
> -All of them use slots and are frozen:
> +The classes defined in this package make heavy use of :mod:`pydantic`.
> +Nearly all of them are frozen:
>
> -    * Slots enables some optimizations, by pre-allocating space for the defined
> -      attributes in the underlying data structure,
>      * Frozen makes the object immutable. This enables further optimizations,
>        and makes it thread safe should we ever want to move in that direction.
>  """
>
> -import json
> -import os.path
>  import tarfile
> -from dataclasses import dataclass, fields
> -from enum import auto, unique
> -from pathlib import Path
> -from typing import Union
> +from enum import Enum, auto, unique
> +from functools import cached_property
> +from pathlib import Path, PurePath
> +from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
>
> -import warlock  # type: ignore[import-untyped]
>  import yaml
> +from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
>  from typing_extensions import Self
>
> -from framework.config.types import (
> -    ConfigurationDict,
> -    DPDKBuildConfigDict,
> -    DPDKConfigurationDict,
> -    NodeConfigDict,
> -    PortConfigDict,
> -    TestRunConfigDict,
> -    TestSuiteConfigDict,
> -    TrafficGeneratorConfigDict,
> -)
>  from framework.exception import ConfigurationError
> -from framework.utils import StrEnum
> +from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
> +
> +if TYPE_CHECKING:
> +    from framework.test_suite import TestSuiteSpec
>
>
>  @unique
> @@ -118,15 +108,14 @@ class Compiler(StrEnum):
>
>
>  @unique
> -class TrafficGeneratorType(StrEnum):
> +class TrafficGeneratorType(str, Enum):
>      """The supported traffic generators."""
>
>      #:
> -    SCAPY = auto()
> +    SCAPY = "SCAPY"
>
>
> -@dataclass(slots=True, frozen=True)
> -class HugepageConfiguration:
> +class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
>      r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> @@ -138,12 +127,10 @@ class HugepageConfiguration:
>      force_first_numa: bool
>
>
> -@dataclass(slots=True, frozen=True)
> -class PortConfig:
> +class PortConfig(BaseModel, frozen=True, extra="forbid"):
>      r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> -        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
>          pci: The PCI address of the port.
>          os_driver_for_dpdk: The operating system driver name for use with DPDK.
>          os_driver: The operating system driver name when the operating system controls the port.
> @@ -152,70 +139,57 @@ class PortConfig:
>          peer_pci: The PCI address of the port connected to this port.
>      """
>
> -    node: str
> -    pci: str
> -    os_driver_for_dpdk: str
> -    os_driver: str
> -    peer_node: str
> -    peer_pci: str
> -
> -    @classmethod
> -    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
> -        """A convenience method that creates the object from fewer inputs.
> -
> -        Args:
> -            node: The node where this port exists.
> -            d: The configuration dictionary.
> -
> -        Returns:
> -            The port configuration instance.
> -        """
> -        return cls(node=node, **d)
> -
> -
> -@dataclass(slots=True, frozen=True)
> -class TrafficGeneratorConfig:
> -    """The configuration of traffic generators.
> -
> -    The class will be expanded when more configuration is needed.
> +    pci: str = Field(
> +        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
> +    )
> +    os_driver_for_dpdk: str = Field(
> +        description="The driver that the kernel should bind this device to for DPDK to use it.",
> +        examples=["vfio-pci", "mlx5_core"],
> +    )
> +    os_driver: str = Field(
> +        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
> +    )
> +    peer_node: str = Field(description="The name of the peer node this port is connected to.")
> +    peer_pci: str = Field(
> +        description="The PCI address of the peer port this port is connected to.",
> +        pattern=REGEX_FOR_PCI_ADDRESS,
> +    )
> +
> +
> +class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
> +    """A protocol required to define traffic generator types.
>
>      Attributes:
> -        traffic_generator_type: The type of the traffic generator.
> +        type: The traffic generator type, the child class is required to define to be distinguished
> +            among others.
>      """
>
> -    traffic_generator_type: TrafficGeneratorType
> +    type: TrafficGeneratorType
>
> -    @staticmethod
> -    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
> -        """A convenience method that produces traffic generator config of the proper type.
>
> -        Args:
> -            d: The configuration dictionary.
> +class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
> +    """Scapy traffic generator specific configuration."""
>
> -        Returns:
> -            The traffic generator configuration instance.
> +    type: Literal[TrafficGeneratorType.SCAPY]
>
> -        Raises:
> -            ConfigurationError: An unknown traffic generator type was encountered.
> -        """
> -        match TrafficGeneratorType(d["type"]):
> -            case TrafficGeneratorType.SCAPY:
> -                return ScapyTrafficGeneratorConfig(
> -                    traffic_generator_type=TrafficGeneratorType.SCAPY
> -                )
> -            case _:
> -                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
>
> +#: A union type discriminating traffic generators by the `type` field.
> +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
>
> -@dataclass(slots=True, frozen=True)
> -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> -    """Scapy traffic generator specific configuration."""
>
> -    pass
> +#: A field representing logical core ranges.
> +LogicalCores = Annotated[
> +    str,
> +    Field(
> +        description="Comma-separated list of logical cores to use. "
> +        "An empty string means use all lcores.",
> +        examples=["1,2,3,4,5,18-22", "10-15"],
> +        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> +    ),
> +]
>
>
> -@dataclass(slots=True, frozen=True)
> -class NodeConfiguration:
> +class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
>      r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
>
>      Attributes:
> @@ -234,285 +208,317 @@ class NodeConfiguration:
>          ports: The ports that can be used in testing.
>      """
>
> -    name: str
> -    hostname: str
> -    user: str
> -    password: str | None
> +    name: str = Field(description="A unique identifier for this node.")
> +    hostname: str = Field(description="The hostname or IP address of the node.")
> +    user: str = Field(description="The login user to use to connect to this node.")
> +    password: str | None = Field(
> +        default=None,
> +        description="The login password to use to connect to this node. "
> +        "SSH keys are STRONGLY preferred, use only as last resort.",
> +    )
>      arch: Architecture
>      os: OS
> -    lcores: str
> -    use_first_core: bool
> -    hugepages: HugepageConfiguration | None
> -    ports: list[PortConfig]
> -
> -    @staticmethod
> -    def from_dict(
> -        d: NodeConfigDict,
> -    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
> -        """A convenience method that processes the inputs before creating a specialized instance.
> -
> -        Args:
> -            d: The configuration dictionary.
> -
> -        Returns:
> -            Either an SUT or TG configuration instance.
> -        """
> -        hugepage_config = None
> -        if "hugepages_2mb" in d:
> -            hugepage_config_dict = d["hugepages_2mb"]
> -            if "force_first_numa" not in hugepage_config_dict:
> -                hugepage_config_dict["force_first_numa"] = False
> -            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
> -
> -        # The calls here contain duplicated code which is here because Mypy doesn't
> -        # properly support dictionary unpacking with TypedDicts
> -        if "traffic_generator" in d:
> -            return TGNodeConfiguration(
> -                name=d["name"],
> -                hostname=d["hostname"],
> -                user=d["user"],
> -                password=d.get("password"),
> -                arch=Architecture(d["arch"]),
> -                os=OS(d["os"]),
> -                lcores=d.get("lcores", "1"),
> -                use_first_core=d.get("use_first_core", False),
> -                hugepages=hugepage_config,
> -                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> -                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
> -            )
> -        else:
> -            return SutNodeConfiguration(
> -                name=d["name"],
> -                hostname=d["hostname"],
> -                user=d["user"],
> -                password=d.get("password"),
> -                arch=Architecture(d["arch"]),
> -                os=OS(d["os"]),
> -                lcores=d.get("lcores", "1"),
> -                use_first_core=d.get("use_first_core", False),
> -                hugepages=hugepage_config,
> -                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> -                memory_channels=d.get("memory_channels", 1),
> -            )
> +    lcores: LogicalCores = "1"
> +    use_first_core: bool = Field(
> +        default=False, description="DPDK won't use the first physical core if set to False."
> +    )
> +    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
> +    ports: list[PortConfig] = Field(min_length=1)
>
>
> -@dataclass(slots=True, frozen=True)
> -class SutNodeConfiguration(NodeConfiguration):
> +class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
>      """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
>
>      Attributes:
>          memory_channels: The number of memory channels to use when running DPDK.
>      """
>
> -    memory_channels: int
> +    memory_channels: int = Field(
> +        default=1, description="Number of memory channels to use when running DPDK."
> +    )
>
>
> -@dataclass(slots=True, frozen=True)
> -class TGNodeConfiguration(NodeConfiguration):
> +class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
>      """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
>
>      Attributes:
>          traffic_generator: The configuration of the traffic generator present on the TG node.
>      """
>
> -    traffic_generator: TrafficGeneratorConfig
> +    traffic_generator: TrafficGeneratorConfigTypes
> +
> +
> +#: Union type for all the node configuration types.
> +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
>
>
> -@dataclass(slots=True, frozen=True)
> -class DPDKBuildConfiguration:
> -    """DPDK build configuration.
> +def resolve_path(path: Path) -> Path:
> +    """Resolve a path into a real path."""
> +    return path.resolve()
>
> -    The configuration used for building DPDK.
> +
> +class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
> +    """DPDK location.
> +
> +    The path to the DPDK sources, build dir and type of location.
>
>      Attributes:
> -        arch: The target architecture to build for.
> -        os: The target os to build for.
> -        cpu: The target CPU to build for.
> -        compiler: The compiler executable to use.
> -        compiler_wrapper: This string will be put in front of the compiler when
> -            executing the build. Useful for adding wrapper commands, such as ``ccache``.
> -        name: The name of the compiler.
> +        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
> +            located on the SUT node, instead of the execution host.
>      """
>
> -    arch: Architecture
> -    os: OS
> -    cpu: CPUType
> -    compiler: Compiler
> -    compiler_wrapper: str
> -    name: str
> +    remote: bool = False
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
> -        r"""A convenience method that processes the inputs before creating an instance.
>
> -        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
> -        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
> +class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK location parent class.
>
> -        Args:
> -            d: The configuration dictionary.
> +    This class is meant to represent any location that is present only locally.
> +    """
>
> -        Returns:
> -            The DPDK build configuration instance.
> -        """
> -        return cls(
> -            arch=Architecture(d["arch"]),
> -            os=OS(d["os"]),
> -            cpu=CPUType(d["cpu"]),
> -            compiler=Compiler(d["compiler"]),
> -            compiler_wrapper=d.get("compiler_wrapper", ""),
> -            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
> -        )
> +    remote: Literal[False] = False
>
>
> -@dataclass(slots=True, frozen=True)
> -class DPDKLocation:
> -    """DPDK location.
> +class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK tree location.
>
> -    The path to the DPDK sources, build dir and type of location.
> +    This class is distinguished from :class:`RemoteDPDKTreeLocation` by performing on-the-fly
> +    validation of the supplied path.
>
>      Attributes:
> -        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
> -            must be provided.
> -        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
> -            provided.
> -        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
> -            located on the SUT node, instead of the execution host.
> -        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
> -            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
> -            `build_options` from configuration to build the DPDK from source.
> +        dpdk_tree: The path to the DPDK source tree directory.
>      """
>
> -    dpdk_tree: str | None
> -    tarball: str | None
> -    remote: bool
> -    build_dir: str | None
> +    dpdk_tree: Path
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
> -        """A convenience method that processes and validates the inputs before creating an instance.
> +    #: Resolve the local DPDK tree path
> +    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
>
> -        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
> -        `remote` is False.
> +    @model_validator(mode="after")
> +    def validate_dpdk_tree_path(self) -> Self:
> +        """Validate the provided DPDK tree path."""
> +        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
> +        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
> +        return self
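
As a usage sketch of the validators above (the path is illustrative and this
assumes the dts directory is on PYTHONPATH so framework.config is
importable):

    from pydantic import ValidationError
    from framework.config import LocalDPDKTreeLocation

    try:
        loc = LocalDPDKTreeLocation.model_validate({"dpdk_tree": "/path/to/dpdk"})
        print(loc.dpdk_tree)  # resolved to an absolute path
    except ValidationError as e:
        print(e)  # e.g. "DPDK tree not found in local filesystem."
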
>
> -        Args:
> -            d: The configuration dictionary.
>
> -        Returns:
> -            The DPDK location instance.
> +class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
> +    """Local DPDK tarball location.
>
> -        Raises:
> -            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
> -                aren't in the right format.
> -        """
> -        dpdk_tree = d.get("dpdk_tree")
> -        tarball = d.get("tarball")
> -        remote = d.get("remote", False)
> -
> -        if not remote:
> -            if dpdk_tree:
> -                if not Path(dpdk_tree).exists():
> -                    raise ConfigurationError(
> -                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
> -                    )
> -
> -                if not Path(dpdk_tree).is_dir():
> -                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
> -
> -                dpdk_tree = os.path.realpath(dpdk_tree)
> -
> -            if tarball:
> -                if not Path(tarball).exists():
> -                    raise ConfigurationError(
> -                        f"DPDK tarball '{tarball}' not found in local filesystem."
> -                    )
> -
> -                if not tarfile.is_tarfile(tarball):
> -                    raise ConfigurationError(
> -                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
> -                    )
> -
> -        return cls(
> -            dpdk_tree=dpdk_tree,
> -            tarball=tarball,
> -            remote=remote,
> -            build_dir=d.get("precompiled_build_dir"),
> -        )
> +    This class is distinguished from :class:`RemoteDPDKTarballLocation` by performing on-the-fly
> +    validation of the supplied tarball.
> +
> +    Attributes:
> +        tarball: The path to the DPDK tarball.
> +    """
>
> +    tarball: Path
>
> -@dataclass
> -class DPDKConfiguration:
> -    """The configuration of the DPDK build.
> +    #: Resolve the local tarball path
> +    resolve_tarball_path = field_validator("tarball")(resolve_path)
>
> -    The configuration contain the location of the DPDK and configuration used for
> -    building it.
> +    @model_validator(mode="after")
> +    def validate_tarball_path(self) -> Self:
> +        """Validate the provided tarball."""
> +        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
> +        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
> +        return self
> +
> +
> +class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK location parent class.
> +
> +    This class is meant to represent any location that is present only remotely.
> +    """
> +
> +    remote: Literal[True] = True
> +
> +
> +class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK tree location.
> +
> +    This class is distinct from :class:`LocalDPDKTreeLocation`, which performs on-the-fly
> +    validation.
> +
> +    Attributes:
> +        dpdk_tree: The path to the DPDK source tree directory.
> +    """
> +
> +    dpdk_tree: PurePath
> +
> +
> +class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
> +    """Remote DPDK tarball location.
> +
> +    This class is distinct from :class:`LocalDPDKTarballLocation`, which performs on-the-fly
> +    validation.
> +
> +    Attributes:
> +        tarball: The path to the DPDK tarball.
> +    """
> +
> +    tarball: PurePath
> +
> +
> +#: Union type for the different DPDK locations.
> +DPDKLocation = (
> +    LocalDPDKTreeLocation
> +    | LocalDPDKTarballLocation
> +    | RemoteDPDKTreeLocation
> +    | RemoteDPDKTarballLocation
> +)
> +
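
The DPDKLocation union above relies on the `remote` Literal plus
`extra="forbid"` to keep the members mutually exclusive. A self-contained
sketch of the selection behavior (model names illustrative):

    from typing import Literal
    from pydantic import BaseModel, TypeAdapter

    class Local(BaseModel, extra="forbid"):
        remote: Literal[False] = False
        dpdk_tree: str

    class Remote(BaseModel, extra="forbid"):
        remote: Literal[True] = True
        dpdk_tree: str

    adapter = TypeAdapter(Local | Remote)
    print(type(adapter.validate_python({"dpdk_tree": "/dpdk", "remote": True})))
    # <class '__main__.Remote'>
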
> +
> +class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """The base configuration for different types of build.
> +
> +    The configuration contain the location of the DPDK and configuration used for building it.
>
>      Attributes:
>          dpdk_location: The location of the DPDK tree.
> -        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
> -            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
>      """
>
>      dpdk_location: DPDKLocation
> -    dpdk_build_config: DPDKBuildConfiguration | None
>
> -    @classmethod
> -    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
>
> -        Args:
> -            d: The configuration dictionary.
> +class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> +    """DPDK precompiled build configuration.
>
> -        Returns:
> -            The DPDK configuration.
> -        """
> -        return cls(
> -            dpdk_location=DPDKLocation.from_dict(d),
> -            dpdk_build_config=(
> -                DPDKBuildConfiguration.from_dict(d["build_options"])
> -                if d.get("build_options")
> -                else None
> -            ),
> -        )
> +    Attributes:
> +        precompiled_build_dir: The name of the build directory of the pre-compiled DPDK, located
> +            in a subdirectory of the `dpdk_tree` or `tarball` root directory.
> +    """
> +
> +    precompiled_build_dir: str = Field(min_length=1)
> +
> +
> +class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """DPDK build options configuration.
> +
> +    The build options used for building DPDK.
> +
> +    Attributes:
> +        arch: The target architecture to build for.
> +        os: The target OS to build for.
> +        cpu: The target CPU to build for.
> +        compiler: The compiler executable to use.
> +        compiler_wrapper: This string will be put in front of the compiler when executing the build.
> +            Useful for adding wrapper commands, such as ``ccache``.
> +    """
> +
> +    arch: Architecture
> +    os: OS
> +    cpu: CPUType
> +    compiler: Compiler
> +    compiler_wrapper: str = ""
>
> +    @cached_property
> +    def name(self) -> str:
> +        """The name of the compiler."""
> +        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
>
> -@dataclass(slots=True, frozen=True)
> -class TestSuiteConfig:
> +
> +class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> +    """DPDK uncompiled build configuration.
> +
> +    Attributes:
> +        build_options: The build options to compile DPDK.
> +    """
> +
> +    build_options: DPDKBuildOptionsConfiguration
> +
> +
> +#: Union type for the different build configurations.
> +DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
> +
> +
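
This union replaces the deleted schema's oneOf between precompiled_build_dir
and build_options: because each member forbids extra keys, supplying both
fails validation. A self-contained sketch (trimmed models, names
illustrative):

    from pydantic import BaseModel, TypeAdapter, ValidationError

    class Pre(BaseModel, extra="forbid"):
        precompiled_build_dir: str

    class Src(BaseModel, extra="forbid"):
        build_options: dict

    adapter = TypeAdapter(Pre | Src)
    adapter.validate_python({"precompiled_build_dir": "build"})  # ok -> Pre
    try:
        adapter.validate_python(
            {"precompiled_build_dir": "build", "build_options": {}}
        )
    except ValidationError:
        print("both keys supplied: rejected, mirroring the old oneOf")
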
> +class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
>      """Test suite configuration.
>
> -    Information about a single test suite to be executed.
> +    Information about a single test suite to be executed. This can also be represented as a string
> +    instead of a mapping, example:
> +
> +    .. code:: yaml
> +
> +        test_runs:
> +        - test_suites:
> +            # As string representation:
> +            - hello_world # test all of `hello_world`, or
> +            - hello_world hello_world_single_core # test only `hello_world_single_core`
> +            # or as model fields:
> +            - test_suite: hello_world
> +              test_cases: [hello_world_single_core] # without this field all test cases are run
>
>      Attributes:
> -        test_suite: The name of the test suite module without the starting ``TestSuite_``.
> -        test_cases: The names of test cases from this test suite to execute.
> +        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
> +        test_cases_names: The names of test cases from this test suite to execute.
>              If empty, all test cases will be executed.
>      """
>
> -    test_suite: str
> -    test_cases: list[str]
> -
> +    test_suite_name: str = Field(
> +        title="Test suite name",
> +        description="The identifying module name of the test suite without the prefix.",
> +        alias="test_suite",
> +    )
> +    test_cases_names: list[str] = Field(
> +        default_factory=list,
> +        title="Test cases by name",
> +        description="The identifying name of the test cases of the test suite.",
> +        alias="test_cases",
> +    )
> +
> +    @cached_property
> +    def test_suite_spec(self) -> "TestSuiteSpec":
> +        """The specification of the requested test suite."""
> +        from framework.test_suite import find_by_name
> +
> +        test_suite_spec = find_by_name(self.test_suite_name)
> +        assert (
> +            test_suite_spec is not None
> +        ), f"{self.test_suite_name} is not a valid test suite module name."
> +        return test_suite_spec
> +
> +    @model_validator(mode="before")
>      @classmethod
> -    def from_dict(
> -        cls,
> -        entry: str | TestSuiteConfigDict,
> -    ) -> Self:
> -        """Create an instance from two different types.
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation of the model into a valid mapping."""
> +        if isinstance(data, str):
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
> +
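
A quick sketch of what the before-mode validator above accepts (suite and
case names illustrative; running this requires a DTS checkout, since
validation also resolves the suite module):

    from framework.config import TestSuiteConfig

    a = TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    b = TestSuiteConfig.model_validate(
        {"test_suite": "hello_world", "test_cases": ["hello_world_single_core"]}
    )
    assert a.test_suite_name == b.test_suite_name
    assert a.test_cases_names == b.test_cases_names
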
> +    @model_validator(mode="after")
> +    def validate_names(self) -> Self:
> +        """Validate the supplied test suite and test cases names.
> +
> +        This validator relies on the cached property `test_suite_spec` to run for the first
> +        time in this call, therefore triggering the assertions if needed.
> +        """
> +        # Use a list, not an iterator: membership is tested once per requested case.
> +        available_test_cases = [
> +            case.name for case in self.test_suite_spec.class_obj.get_test_cases()
> +        ]
> +        for requested_test_case in self.test_cases_names:
> +            assert requested_test_case in available_test_cases, (
> +                f"{requested_test_case} is not a valid test case "
> +                f"of test suite {self.test_suite_name}."
> +            )
>
> -        Args:
> -            entry: Either a suite name or a dictionary containing the config.
> +        return self
>
> -        Returns:
> -            The test suite configuration instance.
> -        """
> -        if isinstance(entry, str):
> -            return cls(test_suite=entry, test_cases=[])
> -        elif isinstance(entry, dict):
> -            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
> -        else:
> -            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
>
> +class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
> +    """The SUT node configuration of a test run.
>
> -@dataclass(slots=True, frozen=True)
> -class TestRunConfiguration:
> +    Attributes:
> +        node_name: The SUT node to use in this test run.
> +        vdevs: The names of virtual devices to test.
> +    """
> +
> +    node_name: str
> +    vdevs: list[str] = Field(default_factory=list)
> +
> +
> +class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
>      """The configuration of a test run.
>
>      The configuration contains testbed information, what tests to execute
> @@ -524,144 +530,130 @@ class TestRunConfiguration:
>          func: Whether to run functional tests.
>          skip_smoke_tests: Whether to skip smoke tests.
>          test_suites: The names of test suites and/or test cases to execute.
> -        system_under_test_node: The SUT node to use in this test run.
> -        traffic_generator_node: The TG node to use in this test run.
> -        vdevs: The names of virtual devices to test.
> +        system_under_test_node: The SUT node configuration to use in this test run.
> +        traffic_generator_node: The TG node name to use in this test run.
>          random_seed: The seed to use for pseudo-random generation.
>      """
>
> -    dpdk_config: DPDKConfiguration
> -    perf: bool
> -    func: bool
> -    skip_smoke_tests: bool
> -    test_suites: list[TestSuiteConfig]
> -    system_under_test_node: SutNodeConfiguration
> -    traffic_generator_node: TGNodeConfiguration
> -    vdevs: list[str]
> -    random_seed: int | None
> -
> -    @classmethod
> -    def from_dict(
> -        cls,
> -        d: TestRunConfigDict,
> -        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
> -    ) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> -
> -        The DPDK build and the test suite config are transformed into their respective objects.
> -        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
> -        are just stored.
> -
> -        Args:
> -            d: The test run configuration dictionary.
> -            node_map: A dictionary mapping node names to their config objects.
> -
> -        Returns:
> -            The test run configuration instance.
> -        """
> -        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
> -        sut_name = d["system_under_test_node"]["node_name"]
> -        skip_smoke_tests = d.get("skip_smoke_tests", False)
> -        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
> -        system_under_test_node = node_map[sut_name]
> -        assert isinstance(
> -            system_under_test_node, SutNodeConfiguration
> -        ), f"Invalid SUT configuration {system_under_test_node}"
> -
> -        tg_name = d["traffic_generator_node"]
> -        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
> -        traffic_generator_node = node_map[tg_name]
> -        assert isinstance(
> -            traffic_generator_node, TGNodeConfiguration
> -        ), f"Invalid TG configuration {traffic_generator_node}"
> -
> -        vdevs = (
> -            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
> -        )
> -        random_seed = d.get("random_seed", None)
> -        return cls(
> -            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
> -            perf=d["perf"],
> -            func=d["func"],
> -            skip_smoke_tests=skip_smoke_tests,
> -            test_suites=test_suites,
> -            system_under_test_node=system_under_test_node,
> -            traffic_generator_node=traffic_generator_node,
> -            vdevs=vdevs,
> -            random_seed=random_seed,
> -        )
> -
> -    def copy_and_modify(self, **kwargs) -> Self:
> -        """Create a shallow copy with any of the fields modified.
> +    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
> +    perf: bool = Field(description="Enable performance testing.")
> +    func: bool = Field(description="Enable functional testing.")
> +    skip_smoke_tests: bool = False
> +    test_suites: list[TestSuiteConfig] = Field(min_length=1)
> +    system_under_test_node: TestRunSUTNodeConfiguration
> +    traffic_generator_node: str
> +    random_seed: int | None = None
>
> -        The only new data are those passed to this method.
> -        The rest are copied from the object's fields calling the method.
>
> -        Args:
> -            **kwargs: The names and types of keyword arguments are defined
> -                by the fields of the :class:`TestRunConfiguration` class.
> +class TestRunWithNodesConfiguration(NamedTuple):
> +    """Tuple containing the configuration of the test run and its associated nodes."""
>
> -        Returns:
> -            The copied and modified test run configuration.
> -        """
> -        new_config = {}
> -        for field in fields(self):
> -            if field.name in kwargs:
> -                new_config[field.name] = kwargs[field.name]
> -            else:
> -                new_config[field.name] = getattr(self, field.name)
> -
> -        return type(self)(**new_config)
> +    #:
> +    test_run_config: TestRunConfiguration
> +    #:
> +    sut_node_config: SutNodeConfiguration
> +    #:
> +    tg_node_config: TGNodeConfiguration
>
>
> -@dataclass(slots=True, frozen=True)
> -class Configuration:
> +class Configuration(BaseModel, extra="forbid"):
>      """DTS testbed and test configuration.
>
> -    The node configuration is not stored in this object. Rather, all used node configurations
> -    are stored inside the test run configuration where the nodes are actually used.
> -
>      Attributes:
>          test_runs: Test run configurations.
> +        nodes: Node configurations.
>      """
>
> -    test_runs: list[TestRunConfiguration]
> +    test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
>
> -    @classmethod
> -    def from_dict(cls, d: ConfigurationDict) -> Self:
> -        """A convenience method that processes the inputs before creating an instance.
> +    @cached_property
> +    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
> +        """List of test runs with the associated nodes."""
> +        test_runs_with_nodes = []
>
> -        DPDK build and test suite config are transformed into their respective objects.
> -        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
> -        are just stored.
> +        for test_run_no, test_run in enumerate(self.test_runs):
> +            sut_node_name = test_run.system_under_test_node.node_name
> +            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
>
> -        Args:
> -            d: The configuration dictionary.
> +            assert sut_node is not None, (
> +                f"test_runs.{test_run_no}.sut_node_config.node_name "
> +                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
> +            )
> +            assert isinstance(sut_node, SutNodeConfiguration), (
> +                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
> +                "but it is not a valid SUT node"
> +            )
>
> -        Returns:
> -            The whole configuration instance.
> -        """
> -        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
> -            map(NodeConfiguration.from_dict, d["nodes"])
> -        )
> -        assert len(nodes) > 0, "There must be a node to test"
> +            tg_node_name = test_run.traffic_generator_node
> +            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
>
> -        node_map = {node.name: node for node in nodes}
> -        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
> +            assert tg_node is not None, (
> +                f"test_runs.{test_run_no}.tg_node_name "
> +                f"({test_run.traffic_generator_node}) is not a valid node name"
> +            )
> +            assert isinstance(tg_node, TGNodeConfiguration), (
> +                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
> +                "but it is not a valid TG node"
> +            )
>
> -        test_runs: list[TestRunConfiguration] = list(
> -            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
> -        )
> +            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
> +
> +        return test_runs_with_nodes
> +
> +    @field_validator("nodes")
> +    @classmethod
> +    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
> +        """Validate that the node names are unique."""
> +        nodes_by_name: dict[str, int] = {}
> +        for node_no, node in enumerate(nodes):
> +            assert node.name not in nodes_by_name, (
> +                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
> +                f"({node.name})"
> +            )
> +            nodes_by_name[node.name] = node_no
> +
> +        return nodes
> +
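
Assertions inside validators surface as regular validation errors. A
self-contained sketch of the duplicate-name check (models trimmed down for
illustration):

    from pydantic import BaseModel, ValidationError, field_validator

    class Node(BaseModel):
        name: str

    class TestbedConfig(BaseModel):
        nodes: list[Node]

        @field_validator("nodes")
        @classmethod
        def validate_node_names(cls, nodes: list[Node]) -> list[Node]:
            seen: dict[str, int] = {}
            for node_no, node in enumerate(nodes):
                assert node.name not in seen, f"node {node_no} duplicates node {seen[node.name]}"
                seen[node.name] = node_no
            return nodes

    try:
        TestbedConfig(nodes=[Node(name="sut"), Node(name="sut")])
    except ValidationError as e:
        print(e)  # reports the assertion message under the nodes field
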
> +    @model_validator(mode="after")
> +    def validate_ports(self) -> Self:
> +        """Validate that the ports are all linked to valid ones."""
> +        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
> +            (node.name, port.pci): False for node in self.nodes for port in node.ports
> +        }
> +
> +        for node_no, node in enumerate(self.nodes):
> +            for port_no, port in enumerate(node.ports):
> +                peer_port_identifier = (port.peer_node, port.peer_pci)
> +                peer_port = port_links.get(peer_port_identifier, None)
> +                assert peer_port is not None, (
> +                    f"invalid peer port specified for nodes.{node_no}.ports.{port_no}"
> +                )
> +                assert peer_port is False, (
> +                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
> +                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
> +                )
> +                port_links[peer_port_identifier] = (node_no, port_no)
>
> -        return cls(test_runs=test_runs)
> +        return self
> +
> +    @model_validator(mode="after")
> +    def validate_test_runs_with_nodes(self) -> Self:
> +        """Validate the test runs to nodes associations.
> +
> +        This validator relies on the cached property `test_runs_with_nodes` to run for the first
> +        time in this call, therefore triggering the assertions if needed.
> +        """
> +        if self.test_runs_with_nodes:
> +            pass
> +        return self
>
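
The `if ...: pass` in the validator above only forces the cached property, so
the cross-references are checked at load time rather than on first use. A
self-contained sketch of the trick (model illustrative):

    from functools import cached_property
    from pydantic import BaseModel, model_validator

    class Model(BaseModel):
        x: int

        @cached_property
        def derived(self) -> int:
            assert self.x > 0, "x must be positive"
            return self.x * 2

        @model_validator(mode="after")
        def validate_derived(self) -> "Model":
            if self.derived:  # evaluate (and cache) the property now
                pass
            return self

    Model(x=1)  # ok; Model(x=0) raises a ValidationError instead
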
>
>  def load_config(config_file_path: Path) -> Configuration:
>      """Load DTS test run configuration from a file.
>
> -    Load the YAML test run configuration file
> -    and :download:`the configuration file schema <conf_yaml_schema.json>`,
> -    validate the test run configuration file, and create a test run configuration object.
> +    Load the YAML test run configuration file, validate it, and create a test run configuration
> +    object.
>
>      The YAML test run configuration file is specified in the :option:`--config-file` command line
>      argument or the :envvar:`DTS_CFG_FILE` environment variable.
> @@ -671,14 +663,14 @@ def load_config(config_file_path: Path) -> Configuration:
>
>      Returns:
>          The parsed test run configuration.
> +
> +    Raises:
> +        ConfigurationError: If the supplied configuration file is invalid.
>      """
>      with open(config_file_path, "r") as f:
>          config_data = yaml.safe_load(f)
>
> -    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
> -
> -    with open(schema_path, "r") as f:
> -        schema = json.load(f)
> -    config = warlock.model_factory(schema, name="_Config")(config_data)
> -    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
> -    return config_obj
> +    try:
> +        return Configuration.model_validate(config_data)
> +    except ValidationError as e:
> +        raise ConfigurationError("failed to load the supplied configuration") from e
> diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
> deleted file mode 100644
> index cc3e78cef5..0000000000
> --- a/dts/framework/config/conf_yaml_schema.json
> +++ /dev/null
> @@ -1,459 +0,0 @@
> -{
> -  "$schema": "https://json-schema.org/draft-07/schema",
> -  "title": "DTS Config Schema",
> -  "definitions": {
> -    "node_name": {
> -      "type": "string",
> -      "description": "A unique identifier for a node"
> -    },
> -    "NIC": {
> -      "type": "string",
> -      "enum": [
> -        "ALL",
> -        "ConnectX3_MT4103",
> -        "ConnectX4_LX_MT4117",
> -        "ConnectX4_MT4115",
> -        "ConnectX5_MT4119",
> -        "ConnectX5_MT4121",
> -        "I40E_10G-10G_BASE_T_BC",
> -        "I40E_10G-10G_BASE_T_X722",
> -        "I40E_10G-SFP_X722",
> -        "I40E_10G-SFP_XL710",
> -        "I40E_10G-X722_A0",
> -        "I40E_1G-1G_BASE_T_X722",
> -        "I40E_25G-25G_SFP28",
> -        "I40E_40G-QSFP_A",
> -        "I40E_40G-QSFP_B",
> -        "IAVF-ADAPTIVE_VF",
> -        "IAVF-VF",
> -        "IAVF_10G-X722_VF",
> -        "ICE_100G-E810C_QSFP",
> -        "ICE_25G-E810C_SFP",
> -        "ICE_25G-E810_XXV_SFP",
> -        "IGB-I350_VF",
> -        "IGB_1G-82540EM",
> -        "IGB_1G-82545EM_COPPER",
> -        "IGB_1G-82571EB_COPPER",
> -        "IGB_1G-82574L",
> -        "IGB_1G-82576",
> -        "IGB_1G-82576_QUAD_COPPER",
> -        "IGB_1G-82576_QUAD_COPPER_ET2",
> -        "IGB_1G-82580_COPPER",
> -        "IGB_1G-I210_COPPER",
> -        "IGB_1G-I350_COPPER",
> -        "IGB_1G-I354_SGMII",
> -        "IGB_1G-PCH_LPTLP_I218_LM",
> -        "IGB_1G-PCH_LPTLP_I218_V",
> -        "IGB_1G-PCH_LPT_I217_LM",
> -        "IGB_1G-PCH_LPT_I217_V",
> -        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
> -        "IGC-I225_LM",
> -        "IGC-I226_LM",
> -        "IXGBE_10G-82599_SFP",
> -        "IXGBE_10G-82599_SFP_SF_QP",
> -        "IXGBE_10G-82599_T3_LOM",
> -        "IXGBE_10G-82599_VF",
> -        "IXGBE_10G-X540T",
> -        "IXGBE_10G-X540_VF",
> -        "IXGBE_10G-X550EM_A_SFP",
> -        "IXGBE_10G-X550EM_X_10G_T",
> -        "IXGBE_10G-X550EM_X_SFP",
> -        "IXGBE_10G-X550EM_X_VF",
> -        "IXGBE_10G-X550T",
> -        "IXGBE_10G-X550_VF",
> -        "brcm_57414",
> -        "brcm_P2100G",
> -        "cavium_0011",
> -        "cavium_a034",
> -        "cavium_a063",
> -        "cavium_a064",
> -        "fastlinq_ql41000",
> -        "fastlinq_ql41000_vf",
> -        "fastlinq_ql45000",
> -        "fastlinq_ql45000_vf",
> -        "hi1822",
> -        "virtio"
> -      ]
> -    },
> -
> -    "ARCH": {
> -      "type": "string",
> -      "enum": [
> -        "x86_64",
> -        "arm64",
> -        "ppc64le"
> -      ]
> -    },
> -    "OS": {
> -      "type": "string",
> -      "enum": [
> -        "linux"
> -      ]
> -    },
> -    "cpu": {
> -      "type": "string",
> -      "description": "Native should be the default on x86",
> -      "enum": [
> -        "native",
> -        "armv8a",
> -        "dpaa2",
> -        "thunderx",
> -        "xgene1"
> -      ]
> -    },
> -    "compiler": {
> -      "type": "string",
> -      "enum": [
> -        "gcc",
> -        "clang",
> -        "icc",
> -        "mscv"
> -      ]
> -    },
> -    "build_options": {
> -      "type": "object",
> -      "properties": {
> -        "arch": {
> -          "type": "string",
> -          "enum": [
> -            "ALL",
> -            "x86_64",
> -            "arm64",
> -            "ppc64le",
> -            "other"
> -          ]
> -        },
> -        "os": {
> -          "$ref": "#/definitions/OS"
> -        },
> -        "cpu": {
> -          "$ref": "#/definitions/cpu"
> -        },
> -        "compiler": {
> -          "$ref": "#/definitions/compiler"
> -        },
> -        "compiler_wrapper": {
> -          "type": "string",
> -          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
> -        }
> -      },
> -      "additionalProperties": false,
> -      "required": [
> -        "arch",
> -        "os",
> -        "cpu",
> -        "compiler"
> -      ]
> -    },
> -    "dpdk_build": {
> -      "type": "object",
> -      "description": "DPDK source and build configuration.",
> -      "properties": {
> -        "dpdk_tree": {
> -          "type": "string",
> -          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
> -        },
> -        "tarball": {
> -          "type": "string",
> -          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
> -        },
> -        "remote": {
> -          "type": "boolean",
> -          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
> -        },
> -        "precompiled_build_dir": {
> -          "type": "string",
> -          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
> -        },
> -        "build_options": {
> -          "$ref": "#/definitions/build_options",
> -          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
> -        }
> -      },
> -      "allOf": [
> -        {
> -          "oneOf": [
> -            {
> -            "required": [
> -              "dpdk_tree"
> -              ]
> -            },
> -            {
> -              "required": [
> -                "tarball"
> -              ]
> -            }
> -          ]
> -        },
> -        {
> -          "oneOf": [
> -            {
> -              "required": [
> -                "precompiled_build_dir"
> -              ]
> -            },
> -            {
> -              "required": [
> -                "build_options"
> -              ]
> -            }
> -          ]
> -        }
> -      ],
> -      "additionalProperties": false
> -    },
> -    "hugepages_2mb": {
> -      "type": "object",
> -      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
> -      "properties": {
> -        "number_of": {
> -          "type": "integer",
> -          "description": "The number of hugepages to configure. Hugepage size will be the system default."
> -        },
> -        "force_first_numa": {
> -          "type": "boolean",
> -          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
> -        }
> -      },
> -      "additionalProperties": false,
> -      "required": [
> -        "number_of"
> -      ]
> -    },
> -    "mac_address": {
> -      "type": "string",
> -      "description": "A MAC address",
> -      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
> -    },
> -    "pci_address": {
> -      "type": "string",
> -      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
> -    },
> -    "port_peer_address": {
> -      "description": "Peer is a TRex port, and IXIA port or a PCI address",
> -      "oneOf": [
> -        {
> -          "description": "PCI peer port",
> -          "$ref": "#/definitions/pci_address"
> -        }
> -      ]
> -    },
> -    "test_suite": {
> -      "type": "string",
> -      "enum": [
> -        "hello_world",
> -        "os_udp",
> -        "pmd_buffer_scatter",
> -        "vlan"
> -      ]
> -    },
> -    "test_target": {
> -      "type": "object",
> -      "properties": {
> -        "suite": {
> -          "$ref": "#/definitions/test_suite"
> -        },
> -        "cases": {
> -          "type": "array",
> -          "description": "If specified, only this subset of test suite's test cases will be run.",
> -          "items": {
> -            "type": "string"
> -          },
> -          "minimum": 1
> -        }
> -      },
> -      "required": [
> -        "suite"
> -      ],
> -      "additionalProperties": false
> -    }
> -  },
> -  "type": "object",
> -  "properties": {
> -    "nodes": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "name": {
> -            "type": "string",
> -            "description": "A unique identifier for this node"
> -          },
> -          "hostname": {
> -            "type": "string",
> -            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
> -          },
> -          "user": {
> -            "type": "string",
> -            "description": "The user to access this node with."
> -          },
> -          "password": {
> -            "type": "string",
> -            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
> -          },
> -          "arch": {
> -            "$ref": "#/definitions/ARCH"
> -          },
> -          "os": {
> -            "$ref": "#/definitions/OS"
> -          },
> -          "lcores": {
> -            "type": "string",
> -            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
> -            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
> -          },
> -          "use_first_core": {
> -            "type": "boolean",
> -            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
> -          },
> -          "memory_channels": {
> -            "type": "integer",
> -            "description": "How many memory channels to use. Optional, defaults to 1."
> -          },
> -          "hugepages_2mb": {
> -            "$ref": "#/definitions/hugepages_2mb"
> -          },
> -          "ports": {
> -            "type": "array",
> -            "items": {
> -              "type": "object",
> -              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
> -              "properties": {
> -                "pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The local PCI address of the port"
> -                },
> -                "os_driver_for_dpdk": {
> -                  "type": "string",
> -                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
> -                },
> -                "os_driver": {
> -                  "type": "string",
> -                  "description": "The driver normally used by this port (ex: i40e)"
> -                },
> -                "peer_node": {
> -                  "type": "string",
> -                  "description": "The name of the node the peer port is on"
> -                },
> -                "peer_pci": {
> -                  "$ref": "#/definitions/pci_address",
> -                  "description": "The PCI address of the peer port"
> -                }
> -              },
> -              "additionalProperties": false,
> -              "required": [
> -                "pci",
> -                "os_driver_for_dpdk",
> -                "os_driver",
> -                "peer_node",
> -                "peer_pci"
> -              ]
> -            },
> -            "minimum": 1
> -          },
> -          "traffic_generator": {
> -            "oneOf": [
> -              {
> -                "type": "object",
> -                "description": "Scapy traffic generator. Used for functional testing.",
> -                "properties": {
> -                  "type": {
> -                    "type": "string",
> -                    "enum": [
> -                      "SCAPY"
> -                    ]
> -                  }
> -                }
> -              }
> -            ]
> -          }
> -        },
> -        "additionalProperties": false,
> -        "required": [
> -          "name",
> -          "hostname",
> -          "user",
> -          "arch",
> -          "os"
> -        ]
> -      },
> -      "minimum": 1
> -    },
> -    "test_runs": {
> -      "type": "array",
> -      "items": {
> -        "type": "object",
> -        "properties": {
> -          "dpdk_build": {
> -            "$ref": "#/definitions/dpdk_build"
> -          },
> -          "perf": {
> -            "type": "boolean",
> -            "description": "Enable performance testing."
> -          },
> -          "func": {
> -            "type": "boolean",
> -            "description": "Enable functional testing."
> -          },
> -          "test_suites": {
> -            "type": "array",
> -            "items": {
> -              "oneOf": [
> -                {
> -                  "$ref": "#/definitions/test_suite"
> -                },
> -                {
> -                  "$ref": "#/definitions/test_target"
> -                }
> -              ]
> -            }
> -          },
> -          "skip_smoke_tests": {
> -            "description": "Optional field that allows you to skip smoke testing",
> -            "type": "boolean"
> -          },
> -          "system_under_test_node": {
> -            "type":"object",
> -            "properties": {
> -              "node_name": {
> -                "$ref": "#/definitions/node_name"
> -              },
> -              "vdevs": {
> -                "description": "Optional list of names of vdevs to be used in the test run",
> -                "type": "array",
> -                "items": {
> -                  "type": "string"
> -                }
> -              }
> -            },
> -            "required": [
> -              "node_name"
> -            ]
> -          },
> -          "traffic_generator_node": {
> -            "$ref": "#/definitions/node_name"
> -          },
> -          "random_seed": {
> -            "type": "integer",
> -            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
> -          }
> -        },
> -        "additionalProperties": false,
> -        "required": [
> -          "dpdk_build",
> -          "perf",
> -          "func",
> -          "test_suites",
> -          "system_under_test_node",
> -          "traffic_generator_node"
> -        ]
> -      },
> -      "minimum": 1
> -    }
> -  },
> -  "required": [
> -    "test_runs",
> -    "nodes"
> -  ],
> -  "additionalProperties": false
> -}
> diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
> deleted file mode 100644
> index 02e738a61e..0000000000
> --- a/dts/framework/config/types.py
> +++ /dev/null
> @@ -1,149 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -"""Configuration dictionary contents specification.
> -
> -These type definitions serve as documentation of the configuration dictionary contents.
> -
> -The definitions use the built-in :class:`~typing.TypedDict` construct.
> -"""
> -
> -from typing import TypedDict
> -
> -
> -class PortConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    pci: str
> -    #:
> -    os_driver_for_dpdk: str
> -    #:
> -    os_driver: str
> -    #:
> -    peer_node: str
> -    #:
> -    peer_pci: str
> -
> -
> -class TrafficGeneratorConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    type: str
> -
> -
> -class HugepageConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    number_of: int
> -    #:
> -    force_first_numa: bool
> -
> -
> -class NodeConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    hugepages_2mb: HugepageConfigurationDict
> -    #:
> -    name: str
> -    #:
> -    hostname: str
> -    #:
> -    user: str
> -    #:
> -    password: str
> -    #:
> -    arch: str
> -    #:
> -    os: str
> -    #:
> -    lcores: str
> -    #:
> -    use_first_core: bool
> -    #:
> -    ports: list[PortConfigDict]
> -    #:
> -    memory_channels: int
> -    #:
> -    traffic_generator: TrafficGeneratorConfigDict
> -
> -
> -class DPDKBuildConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    arch: str
> -    #:
> -    os: str
> -    #:
> -    cpu: str
> -    #:
> -    compiler: str
> -    #:
> -    compiler_wrapper: str
> -
> -
> -class DPDKConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    dpdk_tree: str | None
> -    #:
> -    tarball: str | None
> -    #:
> -    remote: bool
> -    #:
> -    precompiled_build_dir: str | None
> -    #:
> -    build_options: DPDKBuildConfigDict
> -
> -
> -class TestSuiteConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    suite: str
> -    #:
> -    cases: list[str]
> -
> -
> -class TestRunSUTConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    node_name: str
> -    #:
> -    vdevs: list[str]
> -
> -
> -class TestRunConfigDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    dpdk_build: DPDKConfigurationDict
> -    #:
> -    perf: bool
> -    #:
> -    func: bool
> -    #:
> -    skip_smoke_tests: bool
> -    #:
> -    test_suites: TestSuiteConfigDict
> -    #:
> -    system_under_test_node: TestRunSUTConfigDict
> -    #:
> -    traffic_generator_node: str
> -    #:
> -    random_seed: int
> -
> -
> -class ConfigurationDict(TypedDict):
> -    """Allowed keys and values."""
> -
> -    #:
> -    nodes: list[NodeConfigDict]
> -    #:
> -    test_runs: list[TestRunConfigDict]
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index 195622c653..c3d9a27a8c 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -30,7 +30,15 @@
>  from framework.testbed_model.sut_node import SutNode
>  from framework.testbed_model.tg_node import TGNode
>
> -from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
> +from .config import (
> +    Configuration,
> +    DPDKPrecompiledBuildConfiguration,
> +    SutNodeConfiguration,
> +    TestRunConfiguration,
> +    TestSuiteConfig,
> +    TGNodeConfiguration,
> +    load_config,
> +)
>  from .exception import (
>      BlockingTestSuiteError,
>      ConfigurationError,
> @@ -133,11 +141,10 @@ def run(self) -> None:
>              self._result.update_setup(Result.PASS)
>
>              # for all test run sections
> -            for test_run_config in self._configuration.test_runs:
> +            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
> +                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
>                  self._logger.set_stage(DtsStage.test_run_setup)
> -                self._logger.info(
> -                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
> -                )
> +                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
>                  self._init_random_seed(test_run_config)
>                  test_run_result = self._result.add_test_run(test_run_config)
>                  # we don't want to modify the original config, so create a copy
> @@ -145,7 +152,7 @@ def run(self) -> None:
>                      SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
>                  )
>                  if not test_run_config.skip_smoke_tests:
> -                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
> +                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
>                  try:
>                      test_suites_with_cases = self._get_test_suites_with_cases(
>                          test_run_test_suites, test_run_config.func, test_run_config.perf
> @@ -161,6 +168,8 @@ def run(self) -> None:
>                      self._connect_nodes_and_run_test_run(
>                          sut_nodes,
>                          tg_nodes,
> +                        sut_node_config,
> +                        tg_node_config,
>                          test_run_config,
>                          test_run_result,
>                          test_suites_with_cases,
> @@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
>          test_suites_with_cases = []
>
>          for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
> +            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
>              test_cases: list[type[TestCase]] = []
>              func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
> -                test_suite_config.test_cases
> +                test_suite_config.test_cases_names
>              )
>              if func:
>                  test_cases.extend(func_test_cases)
> @@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
>          self,
>          sut_nodes: dict[str, SutNode],
>          tg_nodes: dict[str, TGNode],
> +        sut_node_config: SutNodeConfiguration,
> +        tg_node_config: TGNodeConfiguration,
>          test_run_config: TestRunConfiguration,
>          test_run_result: TestRunResult,
>          test_suites_with_cases: Iterable[TestSuiteWithCases],
> @@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
>          Args:
>              sut_nodes: A dictionary storing connected/to be connected SUT nodes.
>              tg_nodes: A dictionary storing connected/to be connected TG nodes.
> +            sut_node_config: The test run's SUT node configuration.
> +            tg_node_config: The test run's TG node configuration.
>              test_run_config: A test run configuration.
>              test_run_result: The test run's result.
>              test_suites_with_cases: The test suites with test cases to run.
>          """
> -        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
> -        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
> +        sut_node = sut_nodes.get(sut_node_config.name)
> +        tg_node = tg_nodes.get(tg_node_config.name)
>
>          try:
>              if not sut_node:
> -                sut_node = SutNode(test_run_config.system_under_test_node)
> +                sut_node = SutNode(sut_node_config)
>                  sut_nodes[sut_node.name] = sut_node
>              if not tg_node:
> -                tg_node = TGNode(test_run_config.traffic_generator_node)
> +                tg_node = TGNode(tg_node_config)
>                  tg_nodes[tg_node.name] = tg_node
>          except Exception as e:
> -            failed_node = test_run_config.system_under_test_node.name
> +            failed_node = test_run_config.system_under_test_node.node_name
>              if sut_node:
> -                failed_node = test_run_config.traffic_generator_node.name
> +                failed_node = test_run_config.traffic_generator_node
>              self._logger.exception(f"The Creation of node {failed_node} failed.")
>              test_run_result.update_setup(Result.FAIL, e)
>
> @@ -369,14 +382,22 @@ def _run_test_run(
>              ConfigurationError: If the DPDK sources or build is not set up from config or settings.
>          """
>          self._logger.info(
> -            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
> +            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
>          )
>          test_run_result.add_sut_info(sut_node.node_info)
>          try:
> -            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
> -            sut_node.set_up_test_run(test_run_config, dpdk_location)
> +            dpdk_build_config = test_run_config.dpdk_config
> +            if new_location := SETTINGS.dpdk_location:
> +                dpdk_build_config = dpdk_build_config.model_copy(
> +                    update={"dpdk_location": new_location}
> +                )
> +            if build_dir := SETTINGS.precompiled_build_dir:
> +                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
> +                    dpdk_location=dpdk_build_config.dpdk_location,
> +                    precompiled_build_dir=build_dir,
> +                )
> +            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
>              test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
> -            tg_node.set_up_test_run(test_run_config, dpdk_location)
> +            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
>              test_run_result.update_setup(Result.PASS)
>          except Exception as e:
>              self._logger.exception("Test run setup failed.")
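
Since the models are frozen, the runner overrides the DPDK location with
model_copy instead of mutating in place. A minimal sketch of the pattern
(model illustrative):

    from pydantic import BaseModel

    class Build(BaseModel, frozen=True):
        location: str
        build_dir: str | None = None

    base = Build(location="/dpdk")
    override = base.model_copy(update={"build_dir": "build"})
    print(base.build_dir, override.build_dir)  # None build

One caveat: model_copy(update=...) does not re-validate the updated fields,
which should be fine here because SETTINGS.dpdk_location was already
validated during argument parsing.
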
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index a452319b90..1253ed86ac 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -60,9 +60,8 @@
>  .. option:: --precompiled-build-dir
>  .. envvar:: DTS_PRECOMPILED_BUILD_DIR
>
> -    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
> -    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
> -    --dpdk-tree or --tarball.
> +    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
> +    binaries are located.
>
>  .. option:: --test-suite
>  .. envvar:: DTS_TEST_SUITES
> @@ -95,13 +94,21 @@
>  import argparse
>  import os
>  import sys
> -import tarfile
>  from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
>  from dataclasses import dataclass, field
>  from pathlib import Path
>  from typing import Callable
>
> -from .config import DPDKLocation, TestSuiteConfig
> +from pydantic import ValidationError
> +
> +from .config import (
> +    DPDKLocation,
> +    LocalDPDKTarballLocation,
> +    LocalDPDKTreeLocation,
> +    RemoteDPDKTarballLocation,
> +    RemoteDPDKTreeLocation,
> +    TestSuiteConfig,
> +)
>
>
>  @dataclass(slots=True)
> @@ -122,6 +129,8 @@ class Settings:
>      #:
>      dpdk_location: DPDKLocation | None = None
>      #:
> +    precompiled_build_dir: str | None = None
> +    #:
>      compile_timeout: float = 1200
>      #:
>      test_suites: list[TestSuiteConfig] = field(default_factory=list)
> @@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
>
>      action = dpdk_build.add_argument(
>          "--precompiled-build-dir",
> -        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
> -        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
> -        "Can only be used with --dpdk-tree or --tarball.",
> +        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
> +        "pre-compiled binaries are located.",
>          metavar="DIR_NAME",
>      )
>      _add_env_var_to_action(action)
> -    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
>
>      action = parser.add_argument(
>          "--compile-timeout",
> @@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
>
>
>  def _process_dpdk_location(
> +    parser: _DTSArgumentParser,
>      dpdk_tree: str | None,
>      tarball: str | None,
>      remote: bool,
> -    build_dir: str | None,
> -):
> +) -> DPDKLocation | None:
>      """Process and validate DPDK build arguments.
>
>      Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
>      `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
> -    the :class:`DPDKLocation` with the provided parameters if validation is successful.
> +    the matching :class:`DPDKLocation` subclass with the provided parameters if validation
> +    succeeds.
>
>      Args:
> -        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
> -            must be provided.
> -        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
> -            provided.
> +        dpdk_tree: The path to the DPDK source tree directory.
> +        tarball: The path to the DPDK tarball.
>          remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
>              execution host.
> -        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
> -            subdirectory of `dpdk_tree` or `tarball` root directory.
>
>      Returns:
>          A DPDK location if construction is successful, otherwise None.
> -
> -    Raises:
> -        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
> -            they aren't in the right format.
>      """
> -    if not (dpdk_tree or tarball):
> -        return None
> -
> -    if not remote:
> -        if dpdk_tree:
> -            if not Path(dpdk_tree).exists():
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
> -                )
> -
> -            if not Path(dpdk_tree).is_dir():
> -                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
> -
> -            dpdk_tree = os.path.realpath(dpdk_tree)
> -
> -        if tarball:
> -            if not Path(tarball).exists():
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tarball '{tarball}' not found in local filesystem."
> -                )
> -
> -            if not tarfile.is_tarfile(tarball):
> -                raise argparse.ArgumentTypeError(
> -                    f"DPDK tarball '{tarball}' must be a valid tar archive."
> -                )
> -
> -    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
> +    if dpdk_tree:
> +        action = parser.find_action("dpdk_tree", _is_from_env)
> +
> +        try:
> +            if remote:
> +                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
> +            else:
> +                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
> +        except ValidationError as e:
> +            print(
> +                "An error has occurred while validating the DPDK tree supplied in the "
> +                f"{'environment variable' if action else 'arguments'}:",
> +                file=sys.stderr,
> +            )
> +            print(e, file=sys.stderr)
> +            sys.exit(1)
> +
> +    if tarball:
> +        action = parser.find_action("tarball", _is_from_env)
> +
> +        try:
> +            if remote:
> +                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
> +            else:
> +                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
> +        except ValidationError as e:
> +            print(
> +                "An error has occurred while validating the DPDK tarball supplied in the "
> +                f"{'environment variable' if action else 'arguments'}:",
> +                file=sys.stderr,
> +            )
> +            print(e, file=sys.stderr)
> +            sys.exit(1)
> +
> +    return None
>
>
>  def _process_test_suites(
> @@ -512,11 +519,24 @@ def _process_test_suites(
>      Returns:
>          A list of test suite configurations to execute.
>      """
> -    if parser.find_action("test_suites", _is_from_env):
> +    action = parser.find_action("test_suites", _is_from_env)
> +    if action:
>          # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
>          args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
>
> -    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
> +    try:
> +        return [
> +            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
> +            for [test_suite, *test_cases] in args
> +        ]
> +    except ValidationError as e:
> +        print(
> +            "An error has occurred while validating the test suites supplied in the "
> +            f"{'environment variable' if action else 'arguments'}:",
> +            file=sys.stderr,
> +        )
> +        print(e, file=sys.stderr)
> +        sys.exit(1)
>
>
>  def get_settings() -> Settings:
> @@ -536,7 +556,7 @@ def get_settings() -> Settings:
>      args = parser.parse_args()
>
>      args.dpdk_location = _process_dpdk_location(
> -        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
> +        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
>      )
>      args.test_suites = _process_test_suites(parser, args.test_suites)
>
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index 62867fd80c..6031eaf937 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -17,7 +17,12 @@
>  from ipaddress import IPv4Interface, IPv6Interface
>  from typing import Union
>
> -from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
> +from framework.config import (
> +    OS,
> +    DPDKBuildConfiguration,
> +    NodeConfiguration,
> +    TestRunConfiguration,
> +)
>  from framework.exception import ConfigurationError
>  from framework.logger import DTSLogger, get_dts_logger
>
> @@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
>          self._init_ports()
>
>      def _init_ports(self) -> None:
> -        self.ports = [Port(port_config) for port_config in self.config.ports]
> +        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
>          self.main_session.update_ports(self.ports)
>          for port in self.ports:
>              self.configure_port_state(port)
>
>      def set_up_test_run(
> -        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
> +        self,
> +        test_run_config: TestRunConfiguration,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Test run setup steps.
>
> @@ -105,7 +112,7 @@ def set_up_test_run(
>          Args:
>              test_run_config: A test run configuration according to which
>                  the setup steps will be taken.
> -            dpdk_location: The target source of the DPDK tree.
> +            dpdk_build_config: The build configuration of DPDK.
>          """
>          self._setup_hugepages()
>
> diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
> index 5f087f40d6..42ab4bb8fd 100644
> --- a/dts/framework/testbed_model/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -364,7 +364,7 @@ def extract_remote_tarball(
>          """
>
>      @abstractmethod
> -    def is_remote_dir(self, remote_path: str) -> bool:
> +    def is_remote_dir(self, remote_path: PurePath) -> bool:
>          """Check if the `remote_path` is a directory.
>
>          Args:
> @@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
>          """
>
>      @abstractmethod
> -    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
> +    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
>          """Check if the `remote_tarball_path` is a tar archive.
>
>          Args:
> diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
> index 82c84cf4f8..817405bea4 100644
> --- a/dts/framework/testbed_model/port.py
> +++ b/dts/framework/testbed_model/port.py
> @@ -54,7 +54,7 @@ class Port:
>      mac_address: str = ""
>      logical_name: str = ""
>
> -    def __init__(self, config: PortConfig):
> +    def __init__(self, node_name: str, config: PortConfig):
>          """Initialize the port from `node_name` and `config`.
>
>          Args:
> @@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
>              config: The test run configuration of the port.
>          """
>          self.identifier = PortIdentifier(
> -            node=config.node,
> +            node=node_name,
>              pci=config.pci,
>          )
>          self.os_driver = config.os_driver
> diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
> index 0d3abbc519..6b66f33e22 100644
> --- a/dts/framework/testbed_model/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -201,12 +201,12 @@ def extract_remote_tarball(
>          if expected_dir:
>              self.send_command(f"ls {expected_dir}", verify=True)
>
> -    def is_remote_dir(self, remote_path: str) -> bool:
> +    def is_remote_dir(self, remote_path: PurePath) -> bool:
>          """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
>          result = self.send_command(f"test -d {remote_path}")
>          return not result.return_code
>
> -    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
> +    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
>          """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
>          result = self.send_command(f"tar -tvf {remote_tarball_path}")
>          return not result.return_code
> @@ -393,4 +393,8 @@ def get_node_info(self) -> NodeInfo:
>              SETTINGS.timeout,
>          ).stdout.split("\n")
>          kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
> -        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
> +        return NodeInfo(
> +            os_name=os_release_info[0].strip(),
> +            os_version=os_release_info[1].strip(),
> +            kernel_version=kernel_version,
> +        )
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index a6c42b548c..57337c8e7d 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,11 +15,17 @@
>  import os
>  import time
>  from dataclasses import dataclass
> -from pathlib import PurePath
> +from pathlib import Path, PurePath
>
>  from framework.config import (
>      DPDKBuildConfiguration,
> -    DPDKLocation,
> +    DPDKBuildOptionsConfiguration,
> +    DPDKPrecompiledBuildConfiguration,
> +    DPDKUncompiledBuildConfiguration,
> +    LocalDPDKTarballLocation,
> +    LocalDPDKTreeLocation,
> +    RemoteDPDKTarballLocation,
> +    RemoteDPDKTreeLocation,
>      SutNodeConfiguration,
>      TestRunConfiguration,
>  )
> @@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
>          return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
>
>      def set_up_test_run(
> -        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
> +        self,
> +        test_run_config: TestRunConfiguration,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Extend the test run setup with vdev config and DPDK build set up.
>
> @@ -188,12 +196,12 @@ def set_up_test_run(
>          Args:
>              test_run_config: A test run configuration according to which
>                  the setup steps will be taken.
> -            dpdk_location: The target source of the DPDK tree.
> +            dpdk_build_config: The build configuration of DPDK.
>          """
> -        super().set_up_test_run(test_run_config, dpdk_location)
> -        for vdev in test_run_config.vdevs:
> +        super().set_up_test_run(test_run_config, dpdk_build_config)
> +        for vdev in test_run_config.system_under_test_node.vdevs:
>              self.virtual_devices.append(VirtualDevice(vdev))
> -        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
> +        self._set_up_dpdk(dpdk_build_config)
>
>      def tear_down_test_run(self) -> None:
>          """Extend the test run teardown with virtual device teardown and DPDK teardown."""
> @@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None:
>          self._tear_down_dpdk()
>
>      def _set_up_dpdk(
> -        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
> +        self,
> +        dpdk_build_config: DPDKBuildConfiguration,
>      ) -> None:
>          """Set up DPDK on the SUT node and bind ports.
>
> @@ -211,21 +220,26 @@ def _set_up_dpdk(
>          are bound to those that DPDK needs.
>
>          Args:
> -            dpdk_location: The location of the DPDK tree.
> -            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
> -                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
> +            dpdk_build_config: A DPDK build configuration to test.
>          """
> -        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
> -        if not self._remote_dpdk_tree_path:
> -            if dpdk_location.dpdk_tree:
> -                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
> -            elif dpdk_location.tarball:
> -                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
> -
> -        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
> -        if not self.remote_dpdk_build_dir and dpdk_build_config:
> -            self._configure_dpdk_build(dpdk_build_config)
> -            self._build_dpdk()
> +        match dpdk_build_config.dpdk_location:
> +            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
> +                self._set_remote_dpdk_tree_path(dpdk_tree)
> +            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
> +                self._copy_dpdk_tree(dpdk_tree)
> +            case RemoteDPDKTarballLocation(tarball=tarball):
> +                self._validate_remote_dpdk_tarball(tarball)
> +                self._prepare_and_extract_dpdk_tarball(tarball)
> +            case LocalDPDKTarballLocation(tarball=tarball):
> +                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
> +                self._prepare_and_extract_dpdk_tarball(remote_tarball)
> +
> +        match dpdk_build_config:
> +            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
> +                self._set_remote_dpdk_build_dir(build_dir)
> +            case DPDKUncompiledBuildConfiguration(build_options=build_options):
> +                self._configure_dpdk_build(build_options)
> +                self._build_dpdk()
>
>          self.bind_ports_to_driver()
>
> @@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None:
>          self.compiler_version = None
>          self.bind_ports_to_driver(for_dpdk=False)
>
> -    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
> +    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
>          """Set the path to the remote DPDK source tree based on the provided DPDK location.
>
> -        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
> -        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
> -
>          Verify DPDK source tree existence on the SUT node, if exists sets the
>          `_remote_dpdk_tree_path` property, otherwise sets nothing.
>
>          Args:
>              dpdk_tree: The path to the DPDK source tree directory.
> -            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
> -                execution host.
>
>          Raises:
>              RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
>                  is not found.
>          """
> -        if remote and dpdk_tree:
> -            if not self.main_session.remote_path_exists(dpdk_tree):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
> -                )
> -            if not self.main_session.is_remote_dir(dpdk_tree):
> -                raise ConfigurationError(
> -                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
> -                )
> -
> -            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
> -
> -    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
> +        if not self.main_session.remote_path_exists(dpdk_tree):
> +            raise RemoteFileNotFoundError(
> +                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
> +            )
> +        if not self.main_session.is_remote_dir(dpdk_tree):
> +            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
> +
> +        self.__remote_dpdk_tree_path = dpdk_tree
> +
> +    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
>          """Copy the DPDK source tree to the SUT.
>
>          Args:
> @@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
>              self._remote_tmp_dir, PurePath(dpdk_tree_path).name
>          )
>
> -    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
> -        """Ensure the DPDK tarball is available on the SUT node and extract it.
> +    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
> +        """Validate the DPDK tarball on the SUT node.
>
> -        This method ensures that the DPDK source tree tarball is available on the
> -        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
> -        `dpdk_tarball` is already on the SUT node, it verifies its existence.
> -        The `dpdk_tarball` is then extracted on the SUT node.
> +        Args:
> +            dpdk_tarball: The path to the DPDK tarball on the SUT node.
>
> -        This method sets the `_remote_dpdk_tree_path` property to the path of the
> -        extracted DPDK tree on the SUT node.
> +        Raises:
> +            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
> +                not found.
> +            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
> +        """
> +        if not self.main_session.remote_path_exists(dpdk_tarball):
> +            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
> +        if not self.main_session.is_remote_tarfile(dpdk_tarball):
> +            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
> +
> +    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
> +        """Copy the local DPDK tarball to the SUT node.
>
>          Args:
> -            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
> -            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
> -                execution host.
> +            dpdk_tarball: The local path to the DPDK tarball.
>
> -        Raises:
> -            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
> -                is not found.
> +        Returns:
> +            The path of the copied tarball on the SUT node.
> +        """
> +        self._logger.info(
> +            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
> +        )
> +        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
> +        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
> +
> +    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
> +        """Prepare the remote DPDK tree path and extract the tarball.
> +
> +        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
> +        the path of the extracted DPDK tree on the SUT node.
> +
> +        Args:
> +            remote_tarball_path: The path to the DPDK tarball on the SUT node.
>          """
>
>          def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
> @@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
>                      return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
>              return remote_tarball_path.with_suffix("")
>
> -        if remote:
> -            if not self.main_session.remote_path_exists(dpdk_tarball):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
> -                )
> -            if not self.main_session.is_remote_tarfile(dpdk_tarball):
> -                raise ConfigurationError(
> -                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
> -                )
> -
> -            remote_tarball_path = PurePath(dpdk_tarball)
> -        else:
> -            self._logger.info(
> -                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
> -            )
> -            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
> -
> -            remote_tarball_path = self.main_session.join_remote_path(
> -                self._remote_tmp_dir, PurePath(dpdk_tarball).name
> -            )
> -
>          tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
>          self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
> -            PurePath(remote_tarball_path).parent,
> +            remote_tarball_path.parent,
>              tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
>          )
>
> @@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
>              self._remote_dpdk_tree_path,
>          )
>
> -    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
> +    def _set_remote_dpdk_build_dir(self, build_dir: str):
>          """Set the `remote_dpdk_build_dir` on the SUT.
>
> -        If :data:`build_dir` is defined, check existence on the SUT node and sets the
> +        Check existence on the SUT node and sets the
>          `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
>          Otherwise, sets nothing.
>
>          Args:
> -            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
> +            build_dir: DPDK has been pre-built and the build directory is located
>                  in a subdirectory of `dpdk_tree` or `tarball` root directory.
>
>          Raises:
>              RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
>                  node.
>          """
> -        if build_dir:
> -            remote_dpdk_build_dir = self.main_session.join_remote_path(
> -                self._remote_dpdk_tree_path, build_dir
> +        remote_dpdk_build_dir = self.main_session.join_remote_path(
> +            self._remote_dpdk_tree_path, build_dir
> +        )
> +        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
> +            raise RemoteFileNotFoundError(
> +                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
>              )
> -            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
> -                raise RemoteFileNotFoundError(
> -                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
> -                )
>
> -            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
> +        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
>
> -    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
> +    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
>          """Populate common environment variables and set the DPDK build related properties.
>
>          This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
> diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
> index d38ae36c2a..17b333e76a 100644
> --- a/dts/framework/testbed_model/topology.py
> +++ b/dts/framework/testbed_model/topology.py
> @@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
>                      port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
>
>          self.type = TopologyType.get_from_value(len(port_links))
> -        dummy_port = Port(PortConfig("", "", "", "", "", ""))
> +        dummy_port = Port(
> +            "",
> +            PortConfig(
> +                pci="0000:00:00.0",
> +                os_driver_for_dpdk="",
> +                os_driver="",
> +                peer_node="",
> +                peer_pci="0000:00:00.0",
> +            ),
> +        )
>          self.tg_port_egress = dummy_port
>          self.sut_port_ingress = dummy_port
>          self.sut_port_egress = dummy_port
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> index a319fa5320..945f6bbbbb 100644
> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -38,6 +38,4 @@ def create_traffic_generator(
>          case ScapyTrafficGeneratorConfig():
>              return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
>          case _:
> -            raise ConfigurationError(
> -                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
> -            )
> +            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 469a12a780..5ac61cd4e1 100644
> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
>          """
>          self._config = config
>          self._tg_node = tg_node
> -        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
> +        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
>          super().__init__(tg_node, **kwargs)
>
>      def send_packet(self, packet: Packet, port: Port) -> None:
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index 78a39e32c7..e862e3ac66 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -28,7 +28,7 @@
>
>  from .exception import InternalError
>
> -REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> +REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
>  _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
>  _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
>  REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> index d7870bd40f..bc3a2a6bf9 100644
> --- a/dts/tests/TestSuite_smoke_tests.py
> +++ b/dts/tests/TestSuite_smoke_tests.py
> @@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
>          path_to_devbind = self.sut_node.path_to_devbind_script
>
>          all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
> -            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
> +            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
>              SETTINGS.timeout,
>          ).stdout
>
> --
> 2.43.0
>
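
For a quick sanity check of the REGEX_FOR_PCI_ADDRESS change above, a minimal
sketch (the PCI address value is made up): the old pattern embedded literal
'/' delimiters, which belong to awk's /pattern/ syntax rather than to the
regular expression itself, so matching a bare PCI address from Python would
always fail. The corrected raw-string pattern matches the address directly,
and the smoke test now adds the slashes only when building the awk command.

    import re

    # Corrected pattern from the hunk above (no surrounding slashes).
    NEW = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
    # Old pattern, reproduced here only for comparison.
    OLD = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"

    assert re.search(NEW, "0000:3b:00.0") is not None  # matches a bare address
    assert re.search(OLD, "0000:3b:00.0") is None      # old pattern needs slashes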

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery
  2024-10-28 17:49   ` [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
  2024-10-31 19:32     ` Nicholas Pratte
@ 2024-10-31 20:21     ` Nicholas Pratte
  2024-11-06 17:58       ` Luca Vizzarro
  1 sibling, 1 reply; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 20:21 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Currently there is a lack of a definition which identifies all the test
> suites available to test. This change intends to simplify the process to
> discover all the test suites and idenfity them.

Noticed this out of the corner of my eye. 'idenfity' should be 'identify.'
<snip>
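
For readers following along, a minimal sketch of the kind of discovery the
commit message describes, built only from the stdlib helpers the patch
imports (pkgutil.iter_modules and importlib.import_module). The `tests`
package name matches the DTS layout; the helper function name here is
illustrative, not taken from the patch.

    from importlib import import_module
    from pkgutil import iter_modules

    import tests  # the DTS package holding the TestSuite_* modules

    def discover_test_suite_module_names() -> list[str]:
        """List every tests.TestSuite_* module available for import."""
        return [
            module.name
            for module in iter_modules(tests.__path__, prefix="tests.")
            if module.name.rsplit(".", 1)[-1].startswith("TestSuite_")
        ]

    for name in discover_test_suite_module_names():
        suite_module = import_module(name)  # the module can then be inspected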

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 5/8] dts: remove warlock dependency
  2024-10-28 17:49   ` [PATCH v4 5/8] dts: remove warlock dependency Luca Vizzarro
@ 2024-10-31 20:23     ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 20:23 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Since pydantic has completely replaced warlock, there is no longer any need
> to keep it as a dependency. This removes it.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/poetry.lock    | 227 +--------------------------------------------
>  dts/pyproject.toml |   1 -
>  2 files changed, 1 insertion(+), 227 deletions(-)
>
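
For context, a minimal sketch of the Pydantic validation that takes over
from warlock's JSON-schema checks (the model and field are illustrative,
not taken from the DTS configuration):

    from pydantic import BaseModel, ValidationError

    class ExampleNodeConfig(BaseModel, frozen=True, extra="forbid"):
        #: Hostname of the node (illustrative field).
        hostname: str

    try:
        ExampleNodeConfig.model_validate({"hostname": "sut1", "typo_key": 1})
    except ValidationError as e:
        print(e)  # extra="forbid" rejects unknown keys, as the schema did
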
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 56c50ad52c..9f7db60793 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -34,24 +34,6 @@ files = [
>      {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
>  ]
>
> -[[package]]
> -name = "attrs"
> -version = "23.1.0"
> -description = "Classes Without Boilerplate"
> -optional = false
> -python-versions = ">=3.7"
> -files = [
> -    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
> -    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
> -]
> -
> -[package.extras]
> -cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
> -dev = ["attrs[docs,tests]", "pre-commit"]
> -docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
> -tests = ["attrs[tests-no-zope]", "zope-interface"]
> -tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
> -
>  [[package]]
>  name = "babel"
>  version = "2.13.1"
> @@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
>  [package.extras]
>  i18n = ["Babel (>=2.7)"]
>
> -[[package]]
> -name = "jsonpatch"
> -version = "1.33"
> -description = "Apply JSON-Patches (RFC 6902)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
> -    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
> -]
> -
> -[package.dependencies]
> -jsonpointer = ">=1.9"
> -
> -[[package]]
> -name = "jsonpointer"
> -version = "2.4"
> -description = "Identify specific nodes in a JSON document (RFC 6901)"
> -optional = false
> -python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
> -files = [
> -    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
> -    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
> -]
> -
> -[[package]]
> -name = "jsonschema"
> -version = "4.18.4"
> -description = "An implementation of JSON Schema validation for Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
> -    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -jsonschema-specifications = ">=2023.03.6"
> -referencing = ">=0.28.4"
> -rpds-py = ">=0.7.1"
> -
> -[package.extras]
> -format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
> -format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
> -
> -[[package]]
> -name = "jsonschema-specifications"
> -version = "2023.7.1"
> -description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
> -    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
> -]
> -
> -[package.dependencies]
> -referencing = ">=0.28.0"
> -
>  [[package]]
>  name = "markupsafe"
>  version = "2.1.3"
> @@ -1073,21 +995,6 @@ files = [
>      {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
>  ]
>
> -[[package]]
> -name = "referencing"
> -version = "0.30.0"
> -description = "JSON Referencing + Python"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
> -    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
> -]
> -
> -[package.dependencies]
> -attrs = ">=22.2.0"
> -rpds-py = ">=0.7.0"
> -
>  [[package]]
>  name = "requests"
>  version = "2.31.0"
> @@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
>  socks = ["PySocks (>=1.5.6,!=1.5.7)"]
>  use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
>
> -[[package]]
> -name = "rpds-py"
> -version = "0.9.2"
> -description = "Python bindings to Rust's persistent data structures (rpds)"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
> -    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
> -    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
> -    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
> -    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
> -    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
> -    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
> -    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
> -    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
> -    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
> -    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
> -    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
> -    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
> -    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
> -    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
> -    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
> -    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
> -    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
> -]
> -
>  [[package]]
>  name = "scapy"
>  version = "2.5.0"
> @@ -1472,17 +1273,6 @@ files = [
>      {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
>  ]
>
> -[[package]]
> -name = "typing-extensions"
> -version = "4.11.0"
> -description = "Backported and Experimental Type Hints for Python 3.8+"
> -optional = false
> -python-versions = ">=3.8"
> -files = [
> -    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
> -    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
> -]
> -
>  [[package]]
>  name = "typing-extensions"
>  version = "4.12.2"
> @@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
>  socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
>  zstd = ["zstandard (>=0.18.0)"]
>
> -[[package]]
> -name = "warlock"
> -version = "2.0.1"
> -description = "Python object model built on JSON schema and JSON patch."
> -optional = false
> -python-versions = ">=3.7,<4.0"
> -files = [
> -    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
> -    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
> -]
> -
> -[package.dependencies]
> -jsonpatch = ">=1,<2"
> -jsonschema = ">=4,<5"
> -
>  [metadata]
>  lock-version = "2.0"
>  python-versions = "^3.10"
> -content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
> +content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 6c2d1ca8a4..9a3fb02ee9 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
>
>  [tool.poetry.dependencies]
>  python = "^3.10"
> -warlock = "^2.0.1"
>  PyYAML = "^6.0"
>  types-PyYAML = "^6.0.8"
>  fabric = "^2.7.1"
> --
> 2.43.0
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 6/8] dts: add autodoc pydantic
  2024-10-28 17:49   ` [PATCH v4 6/8] dts: add autodoc pydantic Luca Vizzarro
@ 2024-10-31 20:52     ` Nicholas Pratte
  2024-11-06 18:04       ` Luca Vizzarro
  0 siblings, 1 reply; 83+ messages in thread
From: Nicholas Pratte @ 2024-10-31 20:52 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Definitely a step in the right direction here!
Just a small typo, but otherwise:

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

<snip>
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries.
>  Configuring DTS
>  ~~~~~~~~~~~~~~~
>
> -DTS configuration is split into nodes and test runs and build targets within test runs,
> -and follows a defined schema as described in `Configuration Schema`_.
> -By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_schema_example>`,
> +DTS configuration is split into nodes and test runs, and must respect the the model definitions as

There are two 'the' in the sentence above: "must respect the the model
definitions".

> +documented in the DTS API docs under the ``config`` page. The root of the configuration is
> +represented by the ``Configuration`` model.
> +By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_example>`,
>  which is a template that illustrates what can be configured in DTS.
>
<snip>
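
Since the quoted documentation points users at the ``Configuration`` model
rather than a JSON schema, a minimal sketch of what loading a YAML config
through a Pydantic model looks like (the model, field and file name are
illustrative; the real DTS entry point may differ):

    import yaml
    from pydantic import BaseModel

    class ExampleConfiguration(BaseModel, extra="forbid"):
        #: Illustrative top-level section.
        test_runs: list[dict]

    with open("conf.yaml") as f:
        config = ExampleConfiguration.model_validate(yaml.safe_load(f))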

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 7/8] dts: improve configuration API docs
  2024-10-28 17:49   ` [PATCH v4 7/8] dts: improve configuration API docs Luca Vizzarro
@ 2024-11-04 17:34     ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-11-04 17:34 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> Pydantic models are not treated the same way as dataclasses by autodoc.
> As a consequence the docstrings need to be applied directly to each
> field. Otherwise the generated API documentation page would present two
> entries for each field, each with its own differences.
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  doc/guides/tools/dts.rst         |   5 +-
>  dts/framework/config/__init__.py | 253 +++++++++++--------------------
>  2 files changed, 88 insertions(+), 170 deletions(-)
>
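
To make the contrast concrete, a minimal sketch of the two styles the commit
message compares (the model and field are illustrative): autodoc renders a
Pydantic model's ``Attributes:`` docstring section alongside its own
per-field entries, so each field ends up documented twice; moving the text
into ``#:`` comments leaves a single entry per field.

    from pydantic import BaseModel

    class BeforeConfig(BaseModel, frozen=True):
        """Docstring style: autodoc emits a second, diverging entry per field.

        Attributes:
            timeout: The command timeout in seconds.
        """

        timeout: float

    class AfterConfig(BaseModel, frozen=True):
        """Per-field ``#:`` style adopted by this patch."""

        #: The command timeout in seconds.
        timeout: float
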
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 7ccca63ae8..ac12c5c4fa 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -1,5 +1,6 @@
>  ..  SPDX-License-Identifier: BSD-3-Clause
>      Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> +    Copyright(c) 2024 Arm Limited
>
>  DPDK Test Suite
>  ===============
> @@ -327,8 +328,8 @@ where we deviate or where some additional clarification is helpful:
>     * The ``dataclass.dataclass`` decorator changes how the attributes are processed.
>       The dataclass attributes which result in instance variables/attributes
>       should also be recorded in the ``Attributes:`` section.
> -   * Class variables/attributes, on the other hand, should be documented with ``#:``
> -     above the type annotated line.
> +   * Class variables/attributes and Pydantic model fields, on the other hand, should be documented
> +     with ``#:`` above the type annotated line.
>       The description may be omitted if the meaning is obvious.
>     * The ``Enum`` and ``TypedDict`` also process the attributes in particular ways
>       and should be documented with ``#:`` as well.
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index c86bfaaabf..d7d3907a33 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -116,54 +116,34 @@ class TrafficGeneratorType(str, Enum):
>
>
>  class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
> -    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
> -
> -    Attributes:
> -        number_of: The number of hugepages to allocate.
> -        force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
> -    """
> +    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
>
> +    #: The number of hugepages to allocate.
>      number_of: int
> +    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
>      force_first_numa: bool
>
>
>  class PortConfig(BaseModel, frozen=True, extra="forbid"):
> -    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
> -
> -    Attributes:
> -        pci: The PCI address of the port.
> -        os_driver_for_dpdk: The operating system driver name for use with DPDK.
> -        os_driver: The operating system driver name when the operating system controls the port.
> -        peer_node: The :class:`~framework.testbed_model.node.Node` of the port
> -            connected to this port.
> -        peer_pci: The PCI address of the port connected to this port.
> -    """
> +    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
>
> -    pci: str = Field(
> -        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
> -    )
> -    os_driver_for_dpdk: str = Field(
> -        description="The driver that the kernel should bind this device to for DPDK to use it.",
> -        examples=["vfio-pci", "mlx5_core"],
> -    )
> -    os_driver: str = Field(
> -        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
> -    )
> -    peer_node: str = Field(description="The name of the peer node this port is connected to.")
> -    peer_pci: str = Field(
> -        description="The PCI address of the peer port this port is connected to.",
> -        pattern=REGEX_FOR_PCI_ADDRESS,
> -    )
> +    #: The PCI address of the port.
> +    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
> +    #: The driver that the kernel should bind this device to for DPDK to use it.
> +    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
> +    #: The operating system driver name when the operating system controls the port.
> +    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
> +    #: The name of the peer node this port is connected to.
> +    peer_node: str
> +    #: The PCI address of the peer port connected to this port.
> +    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
>
>
>  class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
> -    """A protocol required to define traffic generator types.
> -
> -    Attributes:
> -        type: The traffic generator type, the child class is required to define to be distinguished
> -            among others.
> -    """
> +    """A protocol required to define traffic generator types."""
>
> +    #: The traffic generator type the child class is required to define to be distinguished among
> +    #: others.
>      type: TrafficGeneratorType
>
>
> @@ -176,13 +156,10 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
>  #: A union type discriminating traffic generators by the `type` field.
>  TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
>
> -
> -#: A field representing logical core ranges.
> +#: Comma-separated list of logical cores to use. An empty string means use all lcores.
>  LogicalCores = Annotated[
>      str,
>      Field(
> -        description="Comma-separated list of logical cores to use. "
> -        "An empty string means use all lcores.",
>          examples=["1,2,3,4,5,18-22", "10-15"],
>          pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
>      ),
> @@ -190,61 +167,41 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
>
>
>  class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
> -    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
> -
> -    Attributes:
> -        name: The name of the :class:`~framework.testbed_model.node.Node`.
> -        hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
> -            Can be an IP or a domain name.
> -        user: The name of the user used to connect to
> -            the :class:`~framework.testbed_model.node.Node`.
> -        password: The password of the user. The use of passwords is heavily discouraged.
> -            Please use keys instead.
> -        arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
> -        os: The operating system of the :class:`~framework.testbed_model.node.Node`.
> -        lcores: A comma delimited list of logical cores to use when running DPDK.
> -        use_first_core: If :data:`True`, the first logical core won't be used.
> -        hugepages: An optional hugepage configuration.
> -        ports: The ports that can be used in testing.
> -    """
> -
> -    name: str = Field(description="A unique identifier for this node.")
> -    hostname: str = Field(description="The hostname or IP address of the node.")
> -    user: str = Field(description="The login user to use to connect to this node.")
> -    password: str | None = Field(
> -        default=None,
> -        description="The login password to use to connect to this node. "
> -        "SSH keys are STRONGLY preferred, use only as last resort.",
> -    )
> +    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #: The name of the :class:`~framework.testbed_model.node.Node`.
> +    name: str
> +    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
> +    hostname: str
> +    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
> +    user: str
> +    #: The password of the user. The use of passwords is heavily discouraged, please use SSH keys.
> +    password: str | None = None
> +    #: The architecture of the :class:`~framework.testbed_model.node.Node`.
>      arch: Architecture
> +    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
>      os: OS
> +    #: A comma delimited list of logical cores to use when running DPDK.
>      lcores: LogicalCores = "1"
> -    use_first_core: bool = Field(
> -        default=False, description="DPDK won't use the first physical core if set to False."
> -    )
> +    #: If :data:`True`, the first logical core won't be used.
> +    use_first_core: bool = False
> +    #: An optional hugepage configuration.
>      hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
> +    #: The ports that can be used in testing.
>      ports: list[PortConfig] = Field(min_length=1)
>
>
>  class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
> -    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
> +    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
>
> -    Attributes:
> -        memory_channels: The number of memory channels to use when running DPDK.
> -    """
> -
> -    memory_channels: int = Field(
> -        default=1, description="Number of memory channels to use when running DPDK."
> -    )
> +    #: The number of memory channels to use when running DPDK.
> +    memory_channels: int = 1
>
>
>  class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
> -    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
> -
> -    Attributes:
> -        traffic_generator: The configuration of the traffic generator present on the TG node.
> -    """
> +    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
>
> +    #: The configuration of the traffic generator present on the TG node.
>      traffic_generator: TrafficGeneratorConfigTypes
>
>
> @@ -258,20 +215,18 @@ def resolve_path(path: Path) -> Path:
>
>
>  class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
> -    """DPDK location.
> +    """DPDK location base class.
>
> -    The path to the DPDK sources, build dir and type of location.
> -
> -    Attributes:
> -        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
> -            located on the SUT node, instead of the execution host.
> +    The path to the DPDK sources and type of location.
>      """
>
> +    #: Specifies whether to find DPDK on the SUT node or on the local host. These are
> +    #: respectively represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
>      remote: bool = False
>
>
>  class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> -    """Local DPDK location parent class.
> +    """Local DPDK location base class.
>
>      This class is meant to represent any location that is present only locally.
>      """
> @@ -284,14 +239,12 @@ class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
>
>      This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
>      validation.
> -
> -    Attributes:
> -        dpdk_tree: The path to the DPDK source tree directory.
>      """
>
> +    #: The path to the DPDK source tree directory on the local host passed as string.
>      dpdk_tree: Path
>
> -    #: Resolve the local DPDK tree path
> +    #: Resolve the local DPDK tree path.
>      resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
>
>      @model_validator(mode="after")
> @@ -307,14 +260,12 @@ class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
>
>      This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
>      validation.
> -
> -    Attributes:
> -        tarball: The path to the DPDK tarball.
>      """
>
> +    #: The path to the DPDK tarball on the local host passed as string.
>      tarball: Path
>
> -    #: Resolve the local tarball path
> +    #: Resolve the local tarball path.
>      resolve_tarball_path = field_validator("tarball")(resolve_path)
>
>      @model_validator(mode="after")
> @@ -326,7 +277,7 @@ def validate_tarball_path(self) -> Self:
>
>
>  class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
> -    """Remote DPDK location parent class.
> +    """Remote DPDK location base class.
>
>      This class is meant to represent any location that is present only remotely.
>      """
> @@ -338,11 +289,9 @@ class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
>      """Remote DPDK tree location.
>
>      This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
> -
> -    Attributes:
> -        dpdk_tree: The path to the DPDK source tree directory.
>      """
>
> +    #: The path to the DPDK source tree directory on the remote node passed as string.
>      dpdk_tree: PurePath
>
>
> @@ -351,11 +300,9 @@ class RemoteDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
>
>      This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
>      validation.
> -
> -    Attributes:
> -        tarball: The path to the DPDK tarball.
>      """
>
> +    #: The path to the DPDK tarball on the remote node passed as string.
>      tarball: PurePath
>
>
> @@ -372,23 +319,17 @@ class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
>      """The base configuration for different types of build.
>
>      The configuration contains the location of the DPDK and configuration used for building it.
> -
> -    Attributes:
> -        dpdk_location: The location of the DPDK tree.
>      """
>
> +    #: The location of the DPDK tree.
>      dpdk_location: DPDKLocation
>
>
>  class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> -    """DPDK precompiled build configuration.
> -
> -    Attributes:
> -        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
> -            is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will
> -            be using `dpdk_build_config` from configuration to build the DPDK from source.
> -    """
> +    """DPDK precompiled build configuration."""
>
> +    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
> +    #: subdirectory of `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
>      precompiled_build_dir: str = Field(min_length=1)
>
>
> @@ -396,20 +337,18 @@ class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
>      """DPDK build options configuration.
>
>      The build options used for building DPDK.
> -
> -    Attributes:
> -        arch: The target architecture to build for.
> -        os: The target os to build for.
> -        cpu: The target CPU to build for.
> -        compiler: The compiler executable to use.
> -        compiler_wrapper: This string will be put in front of the compiler when executing the build.
> -            Useful for adding wrapper commands, such as ``ccache``.
>      """
>
> +    #: The target architecture to build for.
>      arch: Architecture
> +    #: The target OS to build for.
>      os: OS
> +    #: The target CPU to build for.
>      cpu: CPUType
> +    #: The compiler executable to use.
>      compiler: Compiler
> +    #: This string will be put in front of the compiler when executing the build. Useful for adding
> +    #: wrapper commands, such as ``ccache``.
>      compiler_wrapper: str = ""
>
>      @cached_property
> @@ -419,12 +358,9 @@ def name(self) -> str:
>
>
>  class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
> -    """DPDK uncompiled build configuration.
> -
> -    Attributes:
> -        build_options: The build options to compile DPDK.
> -    """
> +    """DPDK uncompiled build configuration."""
>
> +    #: The build options to compile DPDK with.
>      build_options: DPDKBuildOptionsConfiguration
>
>
> @@ -448,24 +384,13 @@ class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
>              # or as model fields:
>              - test_suite: hello_world
>                test_cases: [hello_world_single_core] # without this field all test cases are run
> -
> -    Attributes:
> -        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
> -        test_cases_names: The names of test cases from this test suite to execute.
> -            If empty, all test cases will be executed.
>      """
>
> -    test_suite_name: str = Field(
> -        title="Test suite name",
> -        description="The identifying module name of the test suite without the prefix.",
> -        alias="test_suite",
> -    )
> -    test_cases_names: list[str] = Field(
> -        default_factory=list,
> -        title="Test cases by name",
> -        description="The identifying name of the test cases of the test suite.",
> -        alias="test_cases",
> -    )
> +    #: The name of the test suite module without the starting ``TestSuite_``.
> +    test_suite_name: str = Field(alias="test_suite")
> +    #: The names of test cases from this test suite to execute. If empty, all test cases will be
> +    #: executed.
> +    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
>
>      @cached_property
>      def test_suite_spec(self) -> "TestSuiteSpec":
> @@ -507,14 +432,11 @@ def validate_names(self) -> Self:
>
>
>  class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
> -    """The SUT node configuration of a test run.
> -
> -    Attributes:
> -        node_name: The SUT node to use in this test run.
> -        vdevs: The names of virtual devices to test.
> -    """
> +    """The SUT node configuration of a test run."""
>
> +    #: The SUT node to use in this test run.
>      node_name: str
> +    #: The names of virtual devices to test.
>      vdevs: list[str] = Field(default_factory=list)
>
>
> @@ -523,25 +445,23 @@ class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
>
>      The configuration contains testbed information, what tests to execute
>      and with what DPDK build.
> -
> -    Attributes:
> -        dpdk_config: The DPDK configuration used to test.
> -        perf: Whether to run performance tests.
> -        func: Whether to run functional tests.
> -        skip_smoke_tests: Whether to skip smoke tests.
> -        test_suites: The names of test suites and/or test cases to execute.
> -        system_under_test_node: The SUT node configuration to use in this test run.
> -        traffic_generator_node: The TG node name to use in this test run.
> -        random_seed: The seed to use for pseudo-random generation.
>      """
>
> +    #: The DPDK configuration used to test.
>      dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
> -    perf: bool = Field(description="Enable performance testing.")
> -    func: bool = Field(description="Enable functional testing.")
> +    #: Whether to run performance tests.
> +    perf: bool
> +    #: Whether to run functional tests.
> +    func: bool
> +    #: Whether to skip smoke tests.
>      skip_smoke_tests: bool = False
> +    #: The names of test suites and/or test cases to execute.
>      test_suites: list[TestSuiteConfig] = Field(min_length=1)
> +    #: The SUT node configuration to use in this test run.
>      system_under_test_node: TestRunSUTNodeConfiguration
> +    #: The TG node name to use in this test run.
>      traffic_generator_node: str
> +    #: The seed to use for pseudo-random generation.
>      random_seed: int | None = None
>
>
> @@ -557,14 +477,11 @@ class TestRunWithNodesConfiguration(NamedTuple):
>
>
>  class Configuration(BaseModel, extra="forbid"):
> -    """DTS testbed and test configuration.
> -
> -    Attributes:
> -        test_runs: Test run configurations.
> -        nodes: Node configurations.
> -    """
> +    """DTS testbed and test configuration."""
>
> +    #: Test run configurations.
>      test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    #: Node configurations.
>      nodes: list[NodeConfigurationTypes] = Field(min_length=1)
>
>      @cached_property
> --
> 2.43.0
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH 4/5] dts: use TestSuiteSpec class imports
  2024-10-29 12:56     ` Luca Vizzarro
@ 2024-11-04 17:49       ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-11-04 17:49 UTC (permalink / raw)
  To: Luca Vizzarro
  Cc: dev, Honnappa Nagarahalli, Juraj Linkeš, Paul Szczepanek

Noted, I appreciate the insight!

On Tue, Oct 29, 2024 at 8:56 AM Luca Vizzarro <Luca.Vizzarro@arm.com> wrote:
>
> On 01/10/2024 21:45, Nicholas Pratte wrote:
> > The code you have here makes sense, and I like the implementation as
> > it removes a lot of fluff in DTSRunner. I know Juraj mentioned in an
> > earlier patch in this series that this functionality intersects with
> > the capabilities series, but I'm missing a lot of context to
> > understand that fully. Maybe you could provide some insight? I'll make
> > sure to analyse this deeper in my own time as well. Beyond that:
>
> Most of the intersection comes from the fact that this series adds
> auto-discovery of test suites and test cases, therefore treating the
> test cases as objects with labels for processing and filtering.
>
> In his capability patches Juraj also needed to turn test cases into
> objects to add runtime metadata to them, such as required capabilities.
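
For illustration, the discovery boils down to something like this rough
sketch (simplified, not the exact framework code):

    from importlib import import_module
    from pkgutil import iter_modules

    def discover_suite_module_names(package_name: str) -> list[str]:
        # Walk the package and collect every test suite module, so suites
        # become addressable objects rather than hardcoded imports.
        package = import_module(package_name)
        return [
            f"{package_name}.{module.name}"
            for module in iter_modules(package.__path__)
            if module.name.startswith("TestSuite_")
        ]

Both series can then attach their own metadata (test case labels,
required capabilities) to the discovered objects.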

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 8/8] dts: use TestSuiteSpec class imports
  2024-10-28 17:49   ` [PATCH v4 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-11-04 17:50     ` Nicholas Pratte
  0 siblings, 0 replies; 83+ messages in thread
From: Nicholas Pratte @ 2024-11-04 17:50 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Patrick Robb

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> The introduction of TestSuiteSpec adds auto-discovery of test suites,
> which are also automatically imported. This causes double imports as the
> runner loads the test suites. This changes the behaviour of the runner
> to load the imported classes from TestSuiteSpec instead of importing
> them again.
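>
> In other words (a sketch of the idea; the diff below shows the real
> change), the runner now does:
>
>     # before: importlib.import_module(...) + inspect.getmembers(...)
>     # after: reuse the class already imported at discovery time
>     test_suite_class = test_suite_config.test_suite_spec.class_obj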
>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> ---
>  dts/framework/runner.py | 84 ++++-------------------------------------
>  1 file changed, 7 insertions(+), 77 deletions(-)
>
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index c3d9a27a8c..5f5837a132 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -2,6 +2,7 @@
>  # Copyright(c) 2010-2019 Intel Corporation
>  # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>  # Copyright(c) 2022-2023 University of New Hampshire
> +# Copyright(c) 2024 Arm Limited
>
>  """Test suite runner module.
>
> @@ -17,8 +18,6 @@
>  and the test case stage runs test cases individually.
>  """
>
> -import importlib
> -import inspect
>  import os
>  import random
>  import sys
> @@ -39,12 +38,7 @@
>      TGNodeConfiguration,
>      load_config,
>  )
> -from .exception import (
> -    BlockingTestSuiteError,
> -    ConfigurationError,
> -    SSHTimeoutError,
> -    TestCaseVerifyError,
> -)
> +from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
>  from .logger import DTSLogger, DtsStage, get_dts_logger
>  from .settings import SETTINGS
>  from .test_result import (
> @@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
>          func: bool,
>          perf: bool,
>      ) -> list[TestSuiteWithCases]:
> -        """Test suites with test cases discovery.
> +        """Get test suites with selected cases.
>
> -        The test suites with test cases defined in the user configuration are discovered
> -        and stored for future use so that we don't import the modules twice and so that
> -        the list of test suites with test cases is available for recording right away.
> +        The test suites with test cases defined in the user configuration are selected
> +        and the corresponding functions and classes are gathered.
>
>          Args:
>              test_suite_configs: Test suite configurations.
> @@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
>              perf: Whether to include performance test cases in the final list.
>
>          Returns:
> -            The discovered test suites, each with test cases.
> +            The test suites, each with test cases.
>          """
>          test_suites_with_cases = []
>
>          for test_suite_config in test_suite_configs:
> -            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
> +            test_suite_class = test_suite_config.test_suite_spec.class_obj
>              test_cases: list[type[TestCase]] = []
>              func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
>                  test_suite_config.test_cases_names
> @@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
>              test_suites_with_cases.append(
>                  TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
>              )
> -
>          return test_suites_with_cases
>
> -    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
> -        """Find the :class:`TestSuite` class in `module_name`.
> -
> -        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
> -        The module name is a standard filename with words separated with underscores.
> -        Search the `module_name` for a :class:`TestSuite` class which starts
> -        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
> -        The first matching class is returned.
> -
> -        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
> -
> -            OS -> Os
> -            TCP -> Tcp
> -
> -        Args:
> -            module_name: The module name without prefix where to search for the test suite.
> -
> -        Returns:
> -            The found test suite class.
> -
> -        Raises:
> -            ConfigurationError: If the corresponding module is not found or
> -                a valid :class:`TestSuite` is not found in the module.
> -        """
> -
> -        def is_test_suite(object) -> bool:
> -            """Check whether `object` is a :class:`TestSuite`.
> -
> -            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
> -
> -            Args:
> -                object: The object to be checked.
> -
> -            Returns:
> -                :data:`True` if `object` is a subclass of `TestSuite`.
> -            """
> -            try:
> -                if issubclass(object, TestSuite) and object is not TestSuite:
> -                    return True
> -            except TypeError:
> -                return False
> -            return False
> -
> -        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
> -        try:
> -            test_suite_module = importlib.import_module(testsuite_module_path)
> -        except ModuleNotFoundError as e:
> -            raise ConfigurationError(
> -                f"Test suite module '{testsuite_module_path}' not found."
> -            ) from e
> -
> -        camel_case_suite_name = "".join(
> -            [suite_word.capitalize() for suite_word in module_name.split("_")]
> -        )
> -        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
> -        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
> -            if class_name == full_suite_name_to_find:
> -                return class_obj
> -        raise ConfigurationError(
> -            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
> -        )
> -
>      def _connect_nodes_and_run_test_run(
>          self,
>          sut_nodes: dict[str, SutNode],
> --
> 2.43.0
>

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery
  2024-10-31 20:21     ` Nicholas Pratte
@ 2024-11-06 17:58       ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 17:58 UTC (permalink / raw)
  To: Nicholas Pratte; +Cc: dev, Paul Szczepanek, Patrick Robb

On 31/10/2024 20:21, Nicholas Pratte wrote:
> On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>>
>> Currently there is a lack of a definition which identifies all the test
>> suites available to test. This change intends to simplify the process to
>> discover all the test suites and idenfity them.
> 
> Noticed this in the corner of my eye. 'idenfity' should be 'identify.'
> <snip>

Nice! Great catch! Thank you!

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 3/8] dts: refactor build and node info classes
  2024-10-31 20:16     ` Nicholas Pratte
@ 2024-11-06 18:02       ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:02 UTC (permalink / raw)
  To: Nicholas Pratte; +Cc: dev, Paul Szczepanek, Patrick Robb

On 31/10/2024 20:16, Nicholas Pratte wrote:

> On Mon, Oct 28, 2024 at 1:51 PM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>>
>> The DPDKBuildInfo and NodeInfo classes, representing information
>> gathered in runtime, were erroneously placed in the configuration
>> package. This moves them in more appropriate modules.
>>
>> NodeInfo, specifically, ia moved to os_session instead of node mostly
> 
> Small typo here, change 'ia' to 'is'.

Once again, great catch!

>> as a consequence of circular dependencies. And given os_session is the
>> top-most module to reference it, it appears to be the most suitable
>> place outside of node.
> 
> As I said, this makes sense to me, but I wonder if it might make sense
> to change 'NodeInfo' to 'OSSessionInfo' or something like that. I'd
> imagine that if any attributes were to be tacked on in the future they
> would probably be os related, but maybe there would be system
> information, and in this case "OSSessionInfo" might be a good middle
> ground. There are existing changes that I've done where arch is
> discovered during runtime, and this could probably be placed in this
> 'NodeInfo' class as well when I get around to revising it. My only
> concern is whether or not having "NodeConfiguration" and "NodeInfo"
> classes floating around might make the framework more confusing to
> read.

You make an excellent point, I didn't think of it too much. This is a 
great suggestion, will apply it.

^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v4 6/8] dts: add autodoc pydantic
  2024-10-31 20:52     ` Nicholas Pratte
@ 2024-11-06 18:04       ` Luca Vizzarro
  0 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:04 UTC (permalink / raw)
  To: Nicholas Pratte; +Cc: dev, Paul Szczepanek, Patrick Robb

On 31/10/2024 20:52, Nicholas Pratte wrote:

>> --- a/doc/guides/tools/dts.rst
>> +++ b/doc/guides/tools/dts.rst
>> @@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries.
>>   Configuring DTS
>>   ~~~~~~~~~~~~~~~
>>
>> -DTS configuration is split into nodes and test runs and build targets within test runs,
>> -and follows a defined schema as described in `Configuration Schema`_.
>> -By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_schema_example>`,
>> +DTS configuration is split into nodes and test runs, and must respect the the model definitions as
> 
> There are two 'the' in the sentence above: "must respect the the model
> definitions".

good catch, thanks!

^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 0/8] dts: Pydantic configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (7 preceding siblings ...)
  2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
@ 2024-11-06 18:09 ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 1/8] dts: add pydantic dependency Luca Vizzarro
                     ` (8 more replies)
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
  9 siblings, 9 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Hi there,

sending a v5 for the pydantic changes.

v5:
- rebased
- fixed typos
- renamed NodeInfo to OSSessionInfo
- fixed bug on DPDKRemoteTarballConfiguration object
v4:
- added autodoc_pydantic due to autodoc warnings
- fixed pydantic models docstrings
- updated docs
- refactored DPDKBuildInfo and NodeInfo which didn't belong in
  configuration
v3:
- removed the common FrozenModel and configured each BaseModel
  individually, due to mypy complaints
v2:
- rebased and merge conflicts resolved:
  - capabilities patch introducing TestCase has now been combined with
    TestSuiteSpec
  - external build patch added more configuration complexity which has
    been re-worked in pydantic adding exclusion via structured models
- split pydantic/warlock dependency chains
- deleted the config schema as no longer needed
- removed config schema generator
- turned all configuration dataclasses into Pydantic BaseModels
- refactored
- improved docstrings

Best,
Luca

Luca Vizzarro (8):
  dts: add pydantic dependency
  dts: add TestSuiteSpec class and discovery
  dts: refactor build and node info classes
  dts: use pydantic in the configuration
  dts: remove warlock dependency
  dts: add autodoc pydantic
  dts: improve configuration API docs
  dts: use TestSuiteSpec class imports

 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 doc/guides/conf.py                            |  13 +
 doc/guides/tools/dts.rst                      | 192 +---
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 848 ++++++++----------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ---
 dts/framework/runner.py                       | 139 +--
 dts/framework/settings.py                     | 124 +--
 dts/framework/test_result.py                  |   6 +-
 dts/framework/test_suite.py                   | 189 +++-
 dts/framework/testbed_model/capability.py     |  12 +-
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |  27 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  12 +-
 dts/framework/testbed_model/sut_node.py       | 204 +++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/poetry.lock                               | 423 +++++----
 dts/pyproject.toml                            |   3 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 26 files changed, 1068 insertions(+), 1798 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 1/8] dts: add pydantic dependency
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                     ` (7 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

As part of configuration validation and deserialization improvements,
this adds pydantic as a project dependency. Pydantic is a library that
caters to both of these needs, while improving the process and code.
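
For instance, a model along these lines (illustrative only, not part of
this patch) validates and freezes its fields on construction:

    from pydantic import BaseModel, Field

    class PortExample(BaseModel, frozen=True, extra="forbid"):
        #: A PCI address, checked against the pattern when the model is built.
        pci: str = Field(pattern=r"^[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}\.[0-9]$")

    PortExample(pci="0000:00:08.0")  # passes; a malformed address raises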

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   1 +
 2 files changed, 170 insertions(+), 2 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index cf5f6569c6..56c50ad52c 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 
 [[package]]
 name = "aenum"
@@ -23,6 +23,17 @@ files = [
     {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
 ]
 
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -567,6 +578,16 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -762,6 +783,130 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.9.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
+    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.6.0"
+pydantic-core = "2.23.4"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+timezone = ["tzdata"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.23.4"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
+    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
+    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
+    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
+    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
+    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
+    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
+    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
+    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
+    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
+    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
+    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
+    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
+    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -880,6 +1025,7 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -887,8 +1033,16 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -905,6 +1059,7 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -912,6 +1067,7 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -1327,6 +1483,17 @@ files = [
     {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
 ]
 
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
+]
+
 [[package]]
 name = "urllib3"
 version = "2.0.7"
@@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
+content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 506380ac2f..6c2d1ca8a4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -28,6 +28,7 @@ scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
 aenum = "^3.1.15"
+pydantic = "^2.9.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 2/8] dts: add TestSuiteSpec class and discovery
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 1/8] dts: add pydantic dependency Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 3/8] dts: refactor build and node info classes Luca Vizzarro
                     ` (6 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Currently there is no definition that identifies all the test suites
available for testing. This change simplifies the process of discovering
and identifying all the available test suites.
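
As an illustration of the naming convention the discovery relies on,
here is a minimal sketch (the prefix constants mirror those in the diff
below; the helper function itself is hypothetical, the framework uses
pydantic's to_pascal instead):

    def class_name_for(module_name: str) -> str:
        # A module named "TestSuite_hello_world" is expected to define
        # a class named "TestHelloWorld".
        name = module_name.removeprefix("TestSuite_")
        pascal = "".join(part.capitalize() for part in name.split("_"))
        return f"Test{pascal}"

    assert class_name_for("TestSuite_hello_world") == "TestHelloWorld"
    # Abbreviations follow the same rule, e.g. "os" -> "Os", "tcp" -> "Tcp".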

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 dts/framework/runner.py                   |   2 +-
 dts/framework/test_suite.py               | 189 +++++++++++++++++++---
 dts/framework/testbed_model/capability.py |  12 +-
 3 files changed, 177 insertions(+), 26 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 8bbe698eaf..195622c653 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
         for test_suite_config in test_suite_configs:
             test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
             test_cases: list[type[TestCase]] = []
-            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
+            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases
             )
             if func:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index cbe3b30ffc..936eb2cede 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -16,13 +17,20 @@
 import inspect
 from collections import Counter
 from collections.abc import Callable, Sequence
+from dataclasses import dataclass
 from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
+from pkgutil import iter_modules
+from types import ModuleType
 from typing import ClassVar, Protocol, TypeVar, Union, cast
 
+from pydantic.alias_generators import to_pascal
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.capability import TestProtocol
 from framework.testbed_model.port import Port
@@ -33,7 +41,7 @@
     PacketFilteringConfig,
 )
 
-from .exception import ConfigurationError, TestCaseVerifyError
+from .exception import ConfigurationError, InternalError, TestCaseVerifyError
 from .logger import DTSLogger, get_dts_logger
 from .utils import get_packet_summaries
 
@@ -112,10 +120,24 @@ def __init__(
         self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
 
     @classmethod
-    def get_test_cases(
+    def get_test_cases(cls) -> list[type["TestCase"]]:
+        """A list of all the available test cases."""
+
+        def is_test_case(function: Callable) -> bool:
+            if inspect.isfunction(function):
+                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
+                # But function.test_type exists.
+                if hasattr(function, "test_type"):
+                    return isinstance(function.test_type, TestCaseType)
+            return False
+
+        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
+
+    @classmethod
+    def filter_test_cases(
         cls, test_case_sublist: Sequence[str] | None = None
     ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
-        """Filter `test_case_subset` from this class.
+        """Filter `test_case_sublist` from this class.
 
         Test cases are regular (or bound) methods decorated with :func:`func_test`
         or :func:`perf_test`.
@@ -129,17 +151,8 @@ def get_test_cases(
             as methods are bound to instances and this method only has access to the class.
 
         Raises:
-            ConfigurationError: If a test case from `test_case_subset` is not found.
+            ConfigurationError: If a test case from `test_case_sublist` is not found.
         """
-
-        def is_test_case(function: Callable) -> bool:
-            if inspect.isfunction(function):
-                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
-                # But function.test_type exists.
-                if hasattr(function, "test_type"):
-                    return isinstance(function.test_type, TestCaseType)
-            return False
-
         if test_case_sublist is None:
             test_case_sublist = []
 
@@ -149,22 +162,22 @@ def is_test_case(function: Callable) -> bool:
         func_test_cases = set()
         perf_test_cases = set()
 
-        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
-            if test_case_name in test_case_sublist_copy:
+        for test_case in cls.get_test_cases():
+            if test_case.name in test_case_sublist_copy:
                 # if test_case_sublist_copy is non-empty, remove the found test case
                 # so that we can look at the remainder at the end
-                test_case_sublist_copy.remove(test_case_name)
+                test_case_sublist_copy.remove(test_case.name)
             elif test_case_sublist:
                 # the original list not being empty means we're filtering test cases
-                # since we didn't remove test_case_name in the previous branch,
+                # since we didn't remove test_case.name in the previous branch,
                 # it doesn't match the filter and we don't want to remove it
                 continue
 
-            match test_case_function.test_type:
+            match test_case.test_type:
                 case TestCaseType.PERFORMANCE:
-                    perf_test_cases.add(test_case_function)
+                    perf_test_cases.add(test_case)
                 case TestCaseType.FUNCTIONAL:
-                    func_test_cases.add(test_case_function)
+                    func_test_cases.add(test_case)
 
         if test_case_sublist_copy:
             raise ConfigurationError(
@@ -536,6 +549,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
     test case function to :class:`TestCase` and sets common variables.
     """
 
+    #:
+    name: ClassVar[str]
     #:
     test_type: ClassVar[TestCaseType]
     #: necessary for mypy so that it can treat this class as the function it's shadowing
@@ -560,6 +575,7 @@ def make_decorator(
 
         def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
             test_case = cast(type[TestCase], func)
+            test_case.name = func.__name__
             test_case.skip = cls.skip
             test_case.skip_reason = cls.skip_reason
             test_case.required_capabilities = set()
@@ -575,3 +591,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
 func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
 #: The decorator for performance test cases.
 perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal(self.name)}"
+
+    @cached_property
+    def class_obj(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            `obj` qualifies if it is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
+            if class_name == self.class_name:
+                return class_obj
+
+        raise InternalError(
+            f"Expected class {self.class_name} not found in module {self.module_name}."
+        )
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
+        PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites. If :data:`None`,
+                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
+            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
+                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_obj:
+                    test_suites.append(test_suite)
+            except InternalError as err:
+                get_dts_logger().warning(err)
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
index 2207957a7a..0d5f0e0b32 100644
--- a/dts/framework/testbed_model/capability.py
+++ b/dts/framework/testbed_model/capability.py
@@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
 
 import inspect
 from abc import ABC, abstractmethod
-from collections.abc import MutableSet, Sequence
+from collections.abc import MutableSet
 from dataclasses import dataclass
-from typing import Callable, ClassVar, Protocol
+from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
 
 from typing_extensions import Self
 
@@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
 from .sut_node import SutNode
 from .topology import Topology, TopologyType
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestCase
+
 
 class Capability(ABC):
     """The base class for various capabilities.
@@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
         if inspect.isclass(test_case_or_suite):
             if self.topology_type is not TopologyType.default:
                 self.add_to_required(test_case_or_suite)
-                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
-                for test_case in func_test_cases | perf_test_cases:
+                for test_case in test_case_or_suite.get_test_cases():
                     if test_case.topology_type.topology_type is TopologyType.default:
                         # test case topology has not been set, use the one set by the test suite
                         self.add_to_required(test_case)
@@ -446,7 +448,7 @@ class TestProtocol(Protocol):
     required_capabilities: ClassVar[set[Capability]] = set()
 
     @classmethod
-    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
+    def get_test_cases(cls) -> list[type["TestCase"]]:
         """Get test cases. Should be implemented by subclasses containing test cases.
 
         Raises:
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 3/8] dts: refactor build and node info classes
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 1/8] dts: add pydantic dependency Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 4/8] dts: use pydantic in the configuration Luca Vizzarro
                     ` (5 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

The DPDKBuildInfo and NodeInfo classes, representing information
gathered at runtime, were erroneously placed in the configuration
package. This change moves them into more appropriate modules.

NodeInfo, specifically, is moved to os_session instead of node, mostly
to avoid circular dependencies. And given that os_session is the
top-most module referencing it, it appears to be the most suitable
place outside of node.

Finally, NodeInfo is renamed to OSSessionInfo, as it represents
information about the target OS session.
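
For reference, the change to import sites is mechanical; a sketch based
on the test_result.py hunk below:

    # before this patch:
    # from framework.config import DPDKBuildInfo, NodeInfo

    # after this patch:
    from framework.testbed_model.os_session import OSSessionInfo
    from framework.testbed_model.sut_node import DPDKBuildInfo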

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 dts/framework/config/__init__.py             | 31 --------------------
 dts/framework/test_result.py                 |  6 ++--
 dts/framework/testbed_model/os_session.py    | 23 +++++++++++++--
 dts/framework/testbed_model/posix_session.py |  8 ++---
 dts/framework/testbed_model/sut_node.py      | 22 ++++++++++----
 5 files changed, 46 insertions(+), 44 deletions(-)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d0d95d00c7..7403ccbf14 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -318,24 +318,6 @@ class TGNodeConfiguration(NodeConfiguration):
     traffic_generator: TrafficGeneratorConfig
 
 
-@dataclass(slots=True, frozen=True)
-class NodeInfo:
-    """Supplemental node information.
-
-    Attributes:
-        os_name: The name of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        os_version: The version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        kernel_version: The kernel version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-    """
-
-    os_name: str
-    os_version: str
-    kernel_version: str
-
-
 @dataclass(slots=True, frozen=True)
 class DPDKBuildConfiguration:
     """DPDK build configuration.
@@ -493,19 +475,6 @@ def from_dict(cls, d: DPDKConfigurationDict) -> Self:
         )
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildInfo:
-    """Various versions and other information about a DPDK build.
-
-    Attributes:
-        dpdk_version: The DPDK version that was built.
-        compiler_version: The version of the compiler used to build DPDK.
-    """
-
-    dpdk_version: str | None
-    compiler_version: str | None
-
-
 @dataclass(slots=True, frozen=True)
 class TestSuiteConfig:
     """Test suite configuration.
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 00263ad69e..6014d281b5 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -30,11 +30,13 @@
 
 from framework.testbed_model.capability import Capability
 
-from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig
+from .config import TestRunConfiguration, TestSuiteConfig
 from .exception import DTSError, ErrorSeverity
 from .logger import DTSLogger
 from .settings import SETTINGS
 from .test_suite import TestCase, TestSuite
+from .testbed_model.os_session import OSSessionInfo
+from .testbed_model.sut_node import DPDKBuildInfo
 
 
 @dataclass(slots=True, frozen=True)
@@ -421,7 +423,7 @@ def test_suites_with_cases(self, test_suites_with_cases: list[TestSuiteWithCases
             )
         self._test_suites_with_cases = test_suites_with_cases
 
-    def add_sut_info(self, sut_info: NodeInfo) -> None:
+    def add_sut_info(self, sut_info: OSSessionInfo) -> None:
         """Add SUT information gathered at runtime.
 
         Args:
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 6194ddb989..db37424954 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -24,11 +24,12 @@
 """
 from abc import ABC, abstractmethod
 from collections.abc import Iterable
+from dataclasses import dataclass
 from ipaddress import IPv4Interface, IPv6Interface
 from pathlib import Path, PurePath, PurePosixPath
 from typing import Union
 
-from framework.config import Architecture, NodeConfiguration, NodeInfo
+from framework.config import Architecture, NodeConfiguration
 from framework.logger import DTSLogger
 from framework.remote_session import (
     InteractiveRemoteSession,
@@ -44,6 +45,24 @@
 from .port import Port
 
 
+@dataclass(slots=True, frozen=True)
+class OSSessionInfo:
+    """Supplemental OS session information.
+
+    Attributes:
+        os_name: The name of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        os_version: The version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        kernel_version: The kernel version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+    """
+
+    os_name: str
+    os_version: str
+    kernel_version: str
+
+
 class OSSession(ABC):
     """OS-unaware to OS-aware translation API definition.
 
@@ -482,7 +501,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
         """
 
     @abstractmethod
-    def get_node_info(self) -> NodeInfo:
+    def get_node_info(self) -> OSSessionInfo:
         """Collect additional information about the node.
 
         Returns:
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5ab7c18fb7..d7a1f38cad 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -15,7 +15,7 @@
 from collections.abc import Iterable
 from pathlib import Path, PurePath, PurePosixPath
 
-from framework.config import Architecture, NodeInfo
+from framework.config import Architecture
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
 from framework.settings import SETTINGS
 from framework.utils import (
@@ -26,7 +26,7 @@
     extract_tarball,
 )
 
-from .os_session import OSSession
+from .os_session import OSSession, OSSessionInfo
 
 
 class PosixSession(OSSession):
@@ -386,11 +386,11 @@ def get_compiler_version(self, compiler_name: str) -> str:
             case _:
                 raise ValueError(f"Unknown compiler {compiler_name}")
 
-    def get_node_info(self) -> NodeInfo:
+    def get_node_info(self) -> OSSessionInfo:
         """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
         os_release_info = self.send_command(
             "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return OSSessionInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index e160386324..5474d436a1 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -14,13 +14,12 @@
 
 import os
 import time
+from dataclasses import dataclass
 from pathlib import PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKBuildInfo,
     DPDKLocation,
-    NodeInfo,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -30,10 +29,23 @@
 from framework.utils import MesonArgs, TarCompressionFormat
 
 from .node import Node
-from .os_session import OSSession
+from .os_session import OSSession, OSSessionInfo
 from .virtual_device import VirtualDevice
 
 
+@dataclass(slots=True, frozen=True)
+class DPDKBuildInfo:
+    """Various versions and other information about a DPDK build.
+
+    Attributes:
+        dpdk_version: The DPDK version that was built.
+        compiler_version: The version of the compiler used to build DPDK.
+    """
+
+    dpdk_version: str | None
+    compiler_version: str | None
+
+
 class SutNode(Node):
     """The system under test node.
 
@@ -63,7 +75,7 @@ class SutNode(Node):
     _app_compile_timeout: float
     _dpdk_kill_session: OSSession | None
     _dpdk_version: str | None
-    _node_info: NodeInfo | None
+    _node_info: OSSessionInfo | None
     _compiler_version: str | None
     _path_to_devbind_script: PurePath | None
     _ports_bound_to_dpdk: bool
@@ -125,7 +137,7 @@ def dpdk_version(self) -> str | None:
         return self._dpdk_version
 
     @property
-    def node_info(self) -> NodeInfo:
+    def node_info(self) -> OSSessionInfo:
         """Additional node information."""
         if self._node_info is None:
             self._node_info = self.main_session.get_node_info()
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 4/8] dts: use pydantic in the configuration
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (2 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 3/8] dts: refactor build and node info classes Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-07  0:33     ` Patrick Robb
  2024-11-06 18:09   ` [PATCH v5 5/8] dts: remove warlock dependency Luca Vizzarro
                     ` (4 subsequent siblings)
  8 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross-referencing of node names and ports
  - test suite and test case names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to the native str and Enum. This change was necessary to
  enable the discriminator for object unions (see the sketch after
  this list)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying the validation and handling thanks to
  pattern matching
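
For context, here is a minimal, self-contained sketch of the
discriminated-union pattern enabled by that enum change. The TGType and
TGConfig names and the extra OTHER member are illustrative only, not the
framework's actual classes:

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import BaseModel, Field, TypeAdapter

    class TGType(str, Enum):
        SCAPY = "SCAPY"
        OTHER = "OTHER"  # hypothetical second type, for illustration

    class TGConfig(BaseModel):
        type: TGType

    class ScapyTGConfig(TGConfig):
        type: Literal[TGType.SCAPY]

    class OtherTGConfig(TGConfig):
        type: Literal[TGType.OTHER]

    # Pydantic picks the concrete model based on the `type` field alone.
    TGUnion = Annotated[ScapyTGConfig | OtherTGConfig, Field(discriminator="type")]
    config = TypeAdapter(TGUnion).validate_python({"type": "SCAPY"})
    assert isinstance(config, ScapyTGConfig)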

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 822 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |   4 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 648 insertions(+), 1219 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@ config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 7403ccbf14..252e945e12 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,28 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import BaseModel, Field, ValidationError, field_validator, model_validator
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
 
 
 @unique
@@ -118,15 +108,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +127,10 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(BaseModel, frozen=True, extra="forbid"):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +139,57 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, which the child class must define so that it can be
+            distinguished from the others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="forbid"):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,285 +208,317 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
-class SutNodeConfiguration(NodeConfiguration):
+class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
     Attributes:
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
-class TGNodeConfiguration(NodeConfiguration):
+class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
     Attributes:
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
+
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: Path) -> Path:
+    """Resolve a path into a real path."""
+    return path.resolve()
 
-    The configuration used for building DPDK.
+
+class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
+    """DPDK location.
+
+    The path to the DPDK sources, build dir and type of location.
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    This class is distinguished from :class:`RemoteDPDKTreeLocation` in that it enforces on-the-fly
+    validation.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
+    """Local DPDK tarball location.
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
+    This class is distinguished from :class:`RemoteDPDKTarballLocation` in that it enforces on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
 
+    tarball: Path
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
+
+
+class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
+
+
+class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tree location.
+
+    This class is distinct from :class:`LocalDPDKTreeLocation`, which enforces on-the-fly validation.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
+
+
+class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
+    """Remote DPDK tarball location.
+
+    This class is distinct from :class:`LocalDPDKTarballLocation`, which enforces on-the-fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
+
+
+class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The base configuration for different types of build.
+
+    The configuration contains the location of DPDK and the configuration used for building it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK precompiled build configuration.
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
+    Attributes:
+        precompiled_build_dir: If defined, DPDK has been pre-compiled and the build directory is
+            located in a subdirectory of the `dpdk_tree` or `tarball` root directory. Otherwise,
+            `dpdk_build_config` from the configuration will be used to build DPDK from source.
+    """
+
+    precompiled_build_dir: str = Field(min_length=1)
+
+
+class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target os to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
 
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
+    """DPDK uncompiled build configuration.
+
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
+
+
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
+
+
+class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. This can also be represented as a string
+    instead of a mapping, for example:
+
+    .. code:: yaml
+
+        test_runs:
+        - test_suites:
+            # As string representation:
+            - hello_world # test all of `hello_world`, or
+            - hello_world hello_world_single_core # test only `hello_world_single_core`
+            # or as model fields:
+            - test_suite: hello_world
+              test_cases: [hello_world_single_core] # without this field all test cases are run
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation of the model into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        available_test_cases = [
+            t.name for t in self.test_suite_spec.class_obj.get_test_cases()
+        ]
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
 
+class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
+    """The SUT node configuration of a test run.
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
+
+
+class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -524,144 +530,130 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
-
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
 
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
-
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(BaseModel, extra="forbid"):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.sut_node_config.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.sut_node_config.node_name is a valid node name, "
+                "but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.tg_node_name "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.tg_node_name is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
+
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
 
-        return cls(test_runs=test_runs)
+        return self
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
 
 
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -671,14 +663,14 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index cc3e78cef5..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,459 +0,0 @@
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter",
-        "vlan"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
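With the hand-written schema deleted, an equivalent document can be derived from the models
themselves. A minimal sketch using the stock Pydantic v2 API:

    import json

    from framework.config import Configuration

    # Emit a JSON schema generated from the Pydantic models, replacing the
    # hand-maintained file removed above.
    print(json.dumps(Configuration.model_json_schema(), indent=4))
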
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
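With these TypedDict hints removed, malformed input is now rejected up front by the models.
A minimal sketch of the failure mode, relying only on the `Field(min_length=1)` constraints
defined on `Configuration`:

    from pydantic import ValidationError

    from framework.config import Configuration

    try:
        # Both lists violate Field(min_length=1), so validation fails early.
        Configuration.model_validate({"test_runs": [], "nodes": []})
    except ValidationError as e:
        print(e)
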
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@ def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@ def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if build_dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location, precompiled_build_dir=build_dir
+                )
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a32137dbb8..5a8e6e5aee 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@ class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    any valid :class:`DPDKLocation` with the provided parameters if validation is successful.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
 
 
 def _process_test_suites(
@@ -512,11 +519,24 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
@@ -532,7 +552,7 @@ def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index db37424954..294f5b36ba 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -364,7 +364,7 @@ def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@ class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index d7a1f38cad..c0cca2ac50 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
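The sut_node.py hunks below dispatch on the frozen location models with structural pattern
matching; keyword class patterns only need the named attributes, so no `__match_args__` is
required. A standalone sketch of the idiom with made-up model names:

    from pydantic import BaseModel

    class TreeLocation(BaseModel, frozen=True):
        dpdk_tree: str

    class TarballLocation(BaseModel, frozen=True):
        tarball: str

    def describe(location: TreeLocation | TarballLocation) -> str:
        match location:
            case TreeLocation(dpdk_tree=tree):
                return f"source tree at {tree}"
            case TarballLocation(tarball=tarball):
                return f"tarball at {tarball}"
            case _:
                raise TypeError(f"unsupported location: {location!r}")

    print(describe(TarballLocation(tarball="dpdk.tar.xz")))
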
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 5474d436a1..be3faf7474 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,11 +15,17 @@
 import os
 import time
 from dataclasses import dataclass
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -188,12 +196,12 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -211,21 +220,26 @@ def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
@@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath):
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
         Verify DPDK source tree existence on the SUT node, if exists sets the
         `_remote_dpdk_tree_path` property, otherwise sets nothing.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check that the build directory exists on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: The build directory of the pre-built DPDK, located
                 in a subdirectory of the `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..e862e3ac66 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread
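
The `_set_up_dpdk` hunk above dispatches on the new Pydantic location and
build models with structural pattern matching. A minimal, self-contained
sketch of the same technique, using simplified stand-in models rather than
the exact framework classes:

    from dataclasses import dataclass
    from pathlib import PurePath

    @dataclass
    class RemoteDPDKTreeLocation:
        dpdk_tree: PurePath

    @dataclass
    class LocalDPDKTarballLocation:
        tarball: PurePath

    def set_up_dpdk_location(location: object) -> None:
        # Class patterns bind the matching model's fields by keyword,
        # so each location type gets its own handling branch.
        match location:
            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
                print(f"validate remote DPDK tree: {dpdk_tree}")
            case LocalDPDKTarballLocation(tarball=tarball):
                print(f"copy {tarball} to the SUT, then extract it")
            case _:
                raise ValueError("unsupported DPDK location")

    set_up_dpdk_location(LocalDPDKTarballLocation(PurePath("dpdk.tar.xz")))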

* [PATCH v5 5/8] dts: remove warlock dependency
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (3 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 4/8] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 6/8] dts: add autodoc pydantic Luca Vizzarro
                     ` (3 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Since pydantic has completely replaced warlock, there is no longer any
need to keep the latter as a dependency. This removes it.

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 dts/poetry.lock    | 227 +--------------------------------------------
 dts/pyproject.toml |   1 -
 2 files changed, 1 insertion(+), 227 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 56c50ad52c..9f7db60793 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,24 +34,6 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
-optional = false
-python-versions = ">=3.7"
-files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
-]
-
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
 [package.extras]
 i18n = ["Babel (>=2.7)"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -1073,21 +995,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
 [[package]]
 name = "requests"
 version = "2.31.0"
@@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
 socks = ["PySocks (>=1.5.6,!=1.5.7)"]
 use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -1472,17 +1273,6 @@ files = [
     {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
 ]
 
-[[package]]
-name = "typing-extensions"
-version = "4.11.0"
-description = "Backported and Experimental Type Hints for Python 3.8+"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
-]
-
 [[package]]
 name = "typing-extensions"
 version = "4.12.2"
@@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
 socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
 zstd = ["zstandard (>=0.18.0)"]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
+content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6c2d1ca8a4..9a3fb02ee9 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 6/8] dts: add autodoc pydantic
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (4 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 5/8] dts: remove warlock dependency Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 7/8] dts: improve configuration API docs Luca Vizzarro
                     ` (2 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Add and enable the autodoc-pydantic sphinx extension. Pydantic models
are not correctly recognised by autodoc, causing the generated docs to
lack all of the actual model information. The autodoc-pydantic sphinx
extension fixes this behaviour by formatting the models correctly.
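
The extension is enabled with a guarded import in Sphinx's conf.py, so the
docs still build when the optional dependency is missing. A minimal sketch
of that pattern (generic extension list, not the exact DPDK conf.py):

    # conf.py: enable autodoc-pydantic only when it is installed.
    extensions = ["sphinx.ext.napoleon", "sphinx.ext.autodoc"]

    try:
        import sphinxcontrib.autodoc_pydantic  # noqa: F401

        extensions.append("sphinxcontrib.autodoc_pydantic")
    except ImportError:
        # The docs still build, but Pydantic models will be missing
        # their field information.
        pass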

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 doc/guides/conf.py       |  13 +++
 doc/guides/tools/dts.rst | 187 ++-------------------------------------
 dts/poetry.lock          |  59 +++++++++++-
 dts/pyproject.toml       |   1 +
 4 files changed, 79 insertions(+), 181 deletions(-)

diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index b553d9d5bf..71fed45b3d 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -60,6 +60,19 @@
 # DTS API docs additional configuration
 if environ.get('DTS_DOC_BUILD'):
     extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+
+    # Pydantic models require autodoc_pydantic for the right formatting
+    try:
+        import sphinxcontrib.autodoc_pydantic
+
+        extensions.append("sphinxcontrib.autodoc_pydantic")
+    except ImportError:
+        print(
+            "The DTS API doc dependencies are missing. The generated output won't be "
+            "as intended, and autodoc may throw unexpected warnings.",
+            file=stderr,
+        )
+
     # Napoleon enables the Google format of Python doscstrings.
     napoleon_numpy_docstring = False
     napoleon_attr_annotations = True
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index c52de1808c..fb6504fa59 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries.
 Configuring DTS
 ~~~~~~~~~~~~~~~
 
-DTS configuration is split into nodes and test runs and build targets within test runs,
-and follows a defined schema as described in `Configuration Schema`_.
-By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_schema_example>`,
+DTS configuration is split into nodes and test runs, and must respect the model definitions as
+documented in the DTS API docs under the ``config`` page. The root of the configuration is
+represented by the ``Configuration`` model.
+By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_example>`,
 which is a template that illustrates what can be configured in DTS.
 
 The user must have :ref:`administrator privileges <sut_admin_user>`
@@ -470,184 +471,10 @@ The output is generated in ``build/doc/api/dts/html``.
 
    Make sure to fix any Sphinx warnings when adding or updating docstrings.
 
+.. _configuration_example:
 
-Configuration Schema
---------------------
-
-Definitions
-~~~~~~~~~~~
-
-_`Node name`
-   *string* – A unique identifier for a node.
-   **Examples**: ``SUT1``, ``TG1``.
-
-_`ARCH`
-   *string* – The CPU architecture.
-   **Supported values**: ``x86_64``, ``arm64``, ``ppc64le``.
-
-_`CPU`
-   *string* – The CPU microarchitecture. Use ``native`` for x86.
-   **Supported values**: ``native``, ``armv8a``, ``dpaa2``, ``thunderx``, ``xgene1``.
-
-_`OS`
-   *string* – The operating system. **Supported values**: ``linux``.
-
-_`Compiler`
-   *string* – The compiler used for building DPDK.
-   **Supported values**: ``gcc``, ``clang``, ``icc``, ``mscv``.
-
-_`Build target`
-   *mapping* – Build targets supported by DTS for building DPDK, described as:
-
-   ==================== =================================================================
-   ``arch``             See `ARCH`_
-   ``os``               See `OS`_
-   ``cpu``              See `CPU`_
-   ``compiler``         See `Compiler`_
-   ``compiler_wrapper`` *string* – Value prepended to the CC variable for the DPDK build.
-
-                        **Example**: ``ccache``
-   ==================== =================================================================
-
-_`hugepages_2mb`
-   *mapping* – hugepages_2mb described as:
-
-   ==================== ================================================================
-   ``number_of``        *integer* – The number of 2MB hugepages to configure.
-
-                        Hugepage size will be the system default.
-   ``force_first_numa`` (*optional*, defaults to ``false``) – If ``true``, it forces the
-
-                        configuration of hugepages on the first NUMA node.
-   ==================== ================================================================
-
-_`Network port`
-   *mapping* – the NIC port described as:
-
-   ====================== =================================================================================
-   ``pci``                *string* – the local PCI address of the port. **Example**: ``0000:00:08.0``
-   ``os_driver_for_dpdk`` | *string* – this port's device driver when using with DPDK
-                          | When setting up the SUT, DTS will bind the network device to this driver
-                          | for compatibility with DPDK.
-
-                          **Examples**: ``vfio-pci``, ``mlx5_core``
-   ``os_driver``          | *string* – this port's device driver when **not** using with DPDK
-                          | When tearing down the tests on the SUT, DTS will bind the network device
-                          | *back* to this driver. This driver is meant to be the one that the SUT would
-                          | normally use for this device, or whichever driver it is preferred to leave the
-                          | device bound to after testing.
-                          | This also represents the driver that is used in conjunction with the traffic
-                          | generator software.
-
-                          **Examples**: ``i40e``, ``mlx5_core``
-   ``peer_node``          *string* – the name of the peer node connected to this port.
-   ``peer_pci``           *string* – the PCI address of the peer node port. **Example**: ``000a:01:00.1``
-   ====================== =================================================================================
-
-_`Test suite`
-   *string* – name of the test suite to run. **Examples**: ``hello_world``, ``os_udp``
-
-_`Test target`
-   *mapping* – selects specific test cases to run from a test suite. Mapping is described as follows:
-
-   ========= ===============================================================================================
-   ``suite`` See `Test suite`_
-   ``cases`` (*optional*) *sequence* of *string* – list of the selected test cases in the test suite to run.
-
-             Unknown test cases will be silently ignored.
-   ========= ===============================================================================================
-
-
-Properties
-~~~~~~~~~~
-
-The configuration requires listing all the test run environments and nodes
-involved in the testing. These can be defined with the following mappings:
-
-``test runs``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the test run environments. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +----------------------------+-------------------------------------------------------------------+
-   | ``build_targets``          | *sequence* of `Build target`_                                     |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``perf``                   | *boolean* – Enable performance testing.                           |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``func``                   | *boolean* – Enable functional testing.                            |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``test_suites``            | *sequence* of **one of** `Test suite`_ **or** `Test target`_      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``skip_smoke_tests``       | (*optional*) *boolean* – Allows you to skip smoke testing         |
-   |                            | if ``true``.                                                      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``system_under_test_node`` | System under test node specified with:                            |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``node_name`` | See `Node name`_                                  |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``vdevs``     | (*optional*) *sequence* of *string*               |
-   |                            |               |                                                   |
-   |                            |               | List of virtual devices passed with the ``--vdev``|
-   |                            |               | argument to DPDK. **Example**: ``crypto_openssl`` |
-   +----------------------------+---------------+---------------------------------------------------+
-   | ``traffic_generator_node`` | Node name for the traffic generator node.                         |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``random_seed``            | (*optional*) *int* – Set a seed for pseudo-random generation.     |
-   +----------------------------+-------------------------------------------------------------------+
-
-``nodes``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the nodes. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``name``              | See `Node name`_                                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hostname``          | *string* – The network hostname or IP address of this node.                           |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``user``              | *string* – The SSH user credential to use to login to this node.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``password``          | (*optional*) *string* – The SSH password credential for this node.                    |
-   |                       |                                                                                       |
-   |                       | **NB**: Use only as last resort. SSH keys are **strongly** preferred.                 |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``arch``              | The architecture of this node. See `ARCH`_ for supported values.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``os``                | The operating system of this node. See `OS`_ for supported values.                    |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``lcores``            | | (*optional*, defaults to 1) *string* – Comma-separated list of logical              |
-   |                       | | cores to use. An empty string means use all lcores.                                 |
-   |                       |                                                                                       |
-   |                       | **Example**: ``1,2,3,4,5,18-22``                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``use_first_core``    | (*optional*, defaults to ``false``) *boolean*                                         |
-   |                       |                                                                                       |
-   |                       | Indicates whether DPDK should use only the first physical core or not.                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``memory_channels``   | (*optional*, defaults to 1) *integer*                                                 |
-   |                       |                                                                                       |
-   |                       | The number of the memory channels to use.                                             |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hugepages_2mb``     | (*optional*) See `hugepages_2mb`_. If unset, 2MB hugepages won't be configured        |
-   |                       |                                                                                       |
-   |                       | in favour of the system configuration.                                                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``ports``             | | *sequence* of `Network port`_ – Describe ports that are **directly** paired with    |
-   |                       | | other nodes used in conjunction with this one. Both ends of the links must be       |
-   |                       | | described. If there any inconsistencies DTS won't run.                              |
-   |                       |                                                                                       |
-   |                       | **Example**: port 1 of node ``SUT1`` is connected to port 1 of node ``TG1`` etc.      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``traffic_generator`` | (*optional*) Traffic generator, if any, setup on this node described as:              |
-   |                       +----------+----------------------------------------------------------------------------+
-   |                       | ``type`` | *string* – **Supported values**: *SCAPY*                                   |
-   +-----------------------+----------+----------------------------------------------------------------------------+
-
-
-.. _configuration_schema_example:
-
-Example
-~~~~~~~
+Configuration Example
+---------------------
 
 The following example (which can be found in ``dts/conf.yaml``) sets up two nodes:
 
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 9f7db60793..ee564676b4 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,6 +34,29 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
+[[package]]
+name = "autodoc-pydantic"
+version = "2.2.0"
+description = "Seamlessly integrate pydantic models in your Sphinx documentation."
+optional = false
+python-versions = "<4.0.0,>=3.8.1"
+files = [
+    {file = "autodoc_pydantic-2.2.0-py3-none-any.whl", hash = "sha256:8c6a36fbf6ed2700ea9c6d21ea76ad541b621fbdf16b5a80ee04673548af4d95"},
+]
+
+[package.dependencies]
+pydantic = ">=2.0,<3.0.0"
+pydantic-settings = ">=2.0,<3.0.0"
+Sphinx = ">=4.0"
+
+[package.extras]
+docs = ["myst-parser (>=3.0.0,<4.0.0)", "sphinx-copybutton (>=0.5.0,<0.6.0)", "sphinx-rtd-theme (>=2.0.0,<3.0.0)", "sphinx-tabs (>=3,<4)", "sphinxcontrib-mermaid (>=0.9.0,<0.10.0)"]
+erdantic = ["erdantic (<2.0)"]
+linting = ["ruff (>=0.4.0,<0.5.0)"]
+security = ["pip-audit (>=2.7.2,<3.0.0)"]
+test = ["coverage (>=7,<8)", "defusedxml (>=0.7.1)", "pytest (>=8.0.0,<9.0.0)", "pytest-sugar (>=1.0.0,<2.0.0)"]
+type-checking = ["mypy (>=1.9,<2.0)", "types-docutils (>=0.20,<0.21)", "typing-extensions (>=4.11,<5.0)"]
+
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -829,6 +852,26 @@ files = [
 [package.dependencies]
 typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
 
+[[package]]
+name = "pydantic-settings"
+version = "2.6.0"
+description = "Settings management using Pydantic"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_settings-2.6.0-py3-none-any.whl", hash = "sha256:4a819166f119b74d7f8c765196b165f95cc7487ce58ea27dec8a5a26be0970e0"},
+    {file = "pydantic_settings-2.6.0.tar.gz", hash = "sha256:44a1804abffac9e6a30372bb45f6cafab945ef5af25e66b1c634c01dd39e0188"},
+]
+
+[package.dependencies]
+pydantic = ">=2.7.0"
+python-dotenv = ">=0.21.0"
+
+[package.extras]
+azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"]
+toml = ["tomli (>=2.0.1)"]
+yaml = ["pyyaml (>=6.0.1)"]
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -935,6 +978,20 @@ cffi = ">=1.4.1"
 docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
 tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
 
+[[package]]
+name = "python-dotenv"
+version = "1.0.1"
+description = "Read key-value pairs from a .env file and set them as environment variables"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca"},
+    {file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"},
+]
+
+[package.extras]
+cli = ["click (>=5.0)"]
+
 [[package]]
 name = "pyyaml"
 version = "6.0.1"
@@ -1304,4 +1361,4 @@ zstd = ["zstandard (>=0.18.0)"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
+content-hash = "fe9a9fdf7b43e8dce2fb5ee600921d4047fef2f4037a78bbd150f71df202493e"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 9a3fb02ee9..f69c70877a 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -44,6 +44,7 @@ optional = true
 sphinx = "<=7"
 sphinx-rtd-theme = ">=1.2.2"
 pyelftools = "^0.31"
+autodoc-pydantic = "^2.2.0"
 
 [build-system]
 requires = ["poetry-core>=1.0.0"]
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 7/8] dts: improve configuration API docs
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (5 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 6/8] dts: add autodoc pydantic Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-06 18:09   ` [PATCH v5 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
  2024-11-07  0:34   ` [PATCH v5 0/8] dts: Pydantic configuration Patrick Robb
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Pydantic models are not treated the same way as dataclasses by autodoc.
As a consequence, the docstrings need to be applied directly to each
field; otherwise the generated API documentation page would present two
entries for each field, each with its own differences.
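
Concretely, each field's description moves out of the class docstring's
Attributes: section and into a `#:` comment above the field. A minimal
sketch of the resulting style, trimmed from the models in this patch:

    from pydantic import BaseModel, Field

    REGEX_FOR_PCI_ADDRESS = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"

    class PortConfig(BaseModel, frozen=True, extra="forbid"):
        """The port configuration (trimmed)."""

        #: The PCI address of the port.
        pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
        #: The name of the peer node this port is connected to.
        peer_node: str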

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 doc/guides/tools/dts.rst         |   5 +-
 dts/framework/config/__init__.py | 253 +++++++++++--------------------
 2 files changed, 88 insertions(+), 170 deletions(-)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index fb6504fa59..f4e297413d 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -1,5 +1,6 @@
 ..  SPDX-License-Identifier: BSD-3-Clause
     Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
+    Copyright(c) 2024 Arm Limited
 
 DPDK Test Suite
 ===============
@@ -327,8 +328,8 @@ where we deviate or where some additional clarification is helpful:
    * The ``dataclass.dataclass`` decorator changes how the attributes are processed.
      The dataclass attributes which result in instance variables/attributes
      should also be recorded in the ``Attributes:`` section.
-   * Class variables/attributes, on the other hand, should be documented with ``#:``
-     above the type annotated line.
+   * Class variables/attributes and Pydantic model fields, on the other hand, should be documented
+     with ``#:`` above the type annotated line.
      The description may be omitted if the meaning is obvious.
    * The ``Enum`` and ``TypedDict`` also process the attributes in particular ways
      and should be documented with ``#:`` as well.
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 252e945e12..670e2a0a28 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -116,54 +116,34 @@ class TrafficGeneratorType(str, Enum):
 
 
 class HugepageConfiguration(BaseModel, frozen=True, extra="forbid"):
-    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        number_of: The number of hugepages to allocate.
-        force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
-    """
+    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
+    #: The number of hugepages to allocate.
     number_of: int
+    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
     force_first_numa: bool
 
 
 class PortConfig(BaseModel, frozen=True, extra="forbid"):
-    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        pci: The PCI address of the port.
-        os_driver_for_dpdk: The operating system driver name for use with DPDK.
-        os_driver: The operating system driver name when the operating system controls the port.
-        peer_node: The :class:`~framework.testbed_model.node.Node` of the port
-            connected to this port.
-        peer_pci: The PCI address of the port connected to this port.
-    """
+    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
-    pci: str = Field(
-        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
-    )
-    os_driver_for_dpdk: str = Field(
-        description="The driver that the kernel should bind this device to for DPDK to use it.",
-        examples=["vfio-pci", "mlx5_core"],
-    )
-    os_driver: str = Field(
-        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
-    )
-    peer_node: str = Field(description="The name of the peer node this port is connected to.")
-    peer_pci: str = Field(
-        description="The PCI address of the peer port this port is connected to.",
-        pattern=REGEX_FOR_PCI_ADDRESS,
-    )
+    #: The PCI address of the port.
+    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
+    #: The driver that the kernel should bind this device to for DPDK to use it.
+    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
+    #: The operating system driver name when the operating system controls the port.
+    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
+    #: The name of the peer node this port is connected to.
+    peer_node: str
+    #: The PCI address of the peer port connected to this port.
+    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
 
 
 class TrafficGeneratorConfig(BaseModel, frozen=True, extra="forbid"):
-    """A protocol required to define traffic generator types.
-
-    Attributes:
-        type: The traffic generator type, the child class is required to define to be distinguished
-            among others.
-    """
+    """A protocol required to define traffic generator types."""
 
+    #: The traffic generator type. The child class is required to define this
+    #: to be distinguished among others.
     type: TrafficGeneratorType
 
 
@@ -176,13 +156,10 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
 #: A union type discriminating traffic generators by the `type` field.
 TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-
-#: A field representing logical core ranges.
+#: Comma-separated list of logical cores to use. An empty string means use all lcores.
 LogicalCores = Annotated[
     str,
     Field(
-        description="Comma-separated list of logical cores to use. "
-        "An empty string means use all lcores.",
         examples=["1,2,3,4,5,18-22", "10-15"],
         pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
     ),
@@ -190,61 +167,41 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig, frozen=True, extra="fo
 
 
 class NodeConfiguration(BaseModel, frozen=True, extra="forbid"):
-    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        name: The name of the :class:`~framework.testbed_model.node.Node`.
-        hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
-            Can be an IP or a domain name.
-        user: The name of the user used to connect to
-            the :class:`~framework.testbed_model.node.Node`.
-        password: The password of the user. The use of passwords is heavily discouraged.
-            Please use keys instead.
-        arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
-        os: The operating system of the :class:`~framework.testbed_model.node.Node`.
-        lcores: A comma delimited list of logical cores to use when running DPDK.
-        use_first_core: If :data:`True`, the first logical core won't be used.
-        hugepages: An optional hugepage configuration.
-        ports: The ports that can be used in testing.
-    """
-
-    name: str = Field(description="A unique identifier for this node.")
-    hostname: str = Field(description="The hostname or IP address of the node.")
-    user: str = Field(description="The login user to use to connect to this node.")
-    password: str | None = Field(
-        default=None,
-        description="The login password to use to connect to this node. "
-        "SSH keys are STRONGLY preferred, use only as last resort.",
-    )
+    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
+
+    #: The name of the :class:`~framework.testbed_model.node.Node`.
+    name: str
+    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
+    hostname: str
+    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
+    user: str
+    #: The password of the user. The use of passwords is heavily discouraged; please use SSH keys.
+    password: str | None = None
+    #: The architecture of the :class:`~framework.testbed_model.node.Node`.
     arch: Architecture
+    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
     os: OS
+    #: A comma delimited list of logical cores to use when running DPDK.
     lcores: LogicalCores = "1"
-    use_first_core: bool = Field(
-        default=False, description="DPDK won't use the first physical core if set to False."
-    )
+    #: If :data:`True`, the first logical core won't be used.
+    use_first_core: bool = False
+    #: An optional hugepage configuration.
     hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    #: The ports that can be used in testing.
     ports: list[PortConfig] = Field(min_length=1)
 
 
 class SutNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
-    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
 
-    Attributes:
-        memory_channels: The number of memory channels to use when running DPDK.
-    """
-
-    memory_channels: int = Field(
-        default=1, description="Number of memory channels to use when running DPDK."
-    )
+    #: The number of memory channels to use when running DPDK.
+    memory_channels: int = 1
 
 
 class TGNodeConfiguration(NodeConfiguration, frozen=True, extra="forbid"):
-    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
-
-    Attributes:
-        traffic_generator: The configuration of the traffic generator present on the TG node.
-    """
+    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
 
+    #: The configuration of the traffic generator present on the TG node.
     traffic_generator: TrafficGeneratorConfigTypes
 
 
@@ -258,20 +215,18 @@ def resolve_path(path: Path) -> Path:
 
 
 class BaseDPDKLocation(BaseModel, frozen=True, extra="forbid"):
-    """DPDK location.
+    """DPDK location base class.
 
-    The path to the DPDK sources, build dir and type of location.
-
-    Attributes:
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
+    The path to the DPDK sources and the type of location.
     """
 
+    #: Specifies whether to find DPDK on the SUT node or on the local host, which are respectively
+    #: represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
     remote: bool = False
 
 
 class LocalDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
-    """Local DPDK location parent class.
+    """Local DPDK location base class.
 
     This class is meant to represent any location that is present only locally.
     """
@@ -284,14 +239,12 @@ class LocalDPDKTreeLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
 
     This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the local host passed as a string.
     dpdk_tree: Path
 
-    #: Resolve the local DPDK tree path
+    #: Resolve the local DPDK tree path.
     resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
     @model_validator(mode="after")
@@ -307,14 +260,12 @@ class LocalDPDKTarballLocation(LocalDPDKLocation, frozen=True, extra="forbid"):
 
     This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the local host passed as a string.
     tarball: Path
 
-    #: Resolve the local tarball path
+    #: Resolve the local tarball path.
     resolve_tarball_path = field_validator("tarball")(resolve_path)
 
     @model_validator(mode="after")
@@ -326,7 +277,7 @@ def validate_tarball_path(self) -> Self:
 
 
 class RemoteDPDKLocation(BaseDPDKLocation, frozen=True, extra="forbid"):
-    """Remote DPDK location parent class.
+    """Remote DPDK location base class.
 
     This class is meant to represent any location that is present only remotely.
     """
@@ -338,11 +289,9 @@ class RemoteDPDKTreeLocation(RemoteDPDKLocation, frozen=True, extra="forbid"):
     """Remote DPDK tree location.
 
     This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the remote node passed as a string.
     dpdk_tree: PurePath
 
 
@@ -351,11 +300,9 @@ class RemoteDPDKTarballLocation(RemoteDPDKLocation, frozen=True, extra="forbid")
 
     This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the remote node passed as a string.
     tarball: PurePath
 
 
@@ -372,23 +319,17 @@ class BaseDPDKBuildConfiguration(BaseModel, frozen=True, extra="forbid"):
     """The base configuration for different types of build.
 
     The configuration contains the location of the DPDK and the configuration used for building it.
-
-    Attributes:
-        dpdk_location: The location of the DPDK tree.
     """
 
+    #: The location of the DPDK tree.
     dpdk_location: DPDKLocation
 
 
 class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
-    """DPDK precompiled build configuration.
-
-    Attributes:
-        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
-            is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will
-            be using `dpdk_build_config` from configuration to build the DPDK from source.
-    """
+    """DPDK precompiled build configuration."""
 
+    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
+    #: subdirectory of the `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
     precompiled_build_dir: str = Field(min_length=1)
 
 
@@ -396,20 +337,18 @@ class DPDKBuildOptionsConfiguration(BaseModel, frozen=True, extra="forbid"):
     """DPDK build options configuration.
 
     The build options used for building DPDK.
-
-    Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when executing the build.
-            Useful for adding wrapper commands, such as ``ccache``.
     """
 
+    #: The target architecture to build for.
     arch: Architecture
+    #: The target OS to build for.
     os: OS
+    #: The target CPU to build for.
     cpu: CPUType
+    #: The compiler executable to use.
     compiler: Compiler
+    #: This string will be put in front of the compiler when executing the build. Useful for adding
+    #: wrapper commands, such as ``ccache``.
     compiler_wrapper: str = ""
 
     @cached_property
@@ -419,12 +358,9 @@ def name(self) -> str:
 
 
 class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration, frozen=True, extra="forbid"):
-    """DPDK uncompiled build configuration.
-
-    Attributes:
-        build_options: The build options to compile DPDK.
-    """
+    """DPDK uncompiled build configuration."""
 
+    #: The build options to compile DPDK with.
     build_options: DPDKBuildOptionsConfiguration
 
 
@@ -448,24 +384,13 @@ class TestSuiteConfig(BaseModel, frozen=True, extra="forbid"):
             # or as model fields:
             - test_suite: hello_world
               test_cases: [hello_world_single_core] # without this field all test cases are run
-
-    Attributes:
-        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases_names: The names of test cases from this test suite to execute.
-            If empty, all test cases will be executed.
     """
 
-    test_suite_name: str = Field(
-        title="Test suite name",
-        description="The identifying module name of the test suite without the prefix.",
-        alias="test_suite",
-    )
-    test_cases_names: list[str] = Field(
-        default_factory=list,
-        title="Test cases by name",
-        description="The identifying name of the test cases of the test suite.",
-        alias="test_cases",
-    )
+    #: The name of the test suite module without the starting ``TestSuite_``.
+    test_suite_name: str = Field(alias="test_suite")
+    #: The names of test cases from this test suite to execute. If empty, all test cases will be
+    #: executed.
+    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
 
     @cached_property
     def test_suite_spec(self) -> "TestSuiteSpec":
@@ -507,14 +432,11 @@ def validate_names(self) -> Self:
 
 
 class TestRunSUTNodeConfiguration(BaseModel, frozen=True, extra="forbid"):
-    """The SUT node configuration of a test run.
-
-    Attributes:
-        node_name: The SUT node to use in this test run.
-        vdevs: The names of virtual devices to test.
-    """
+    """The SUT node configuration of a test run."""
 
+    #: The SUT node to use in this test run.
     node_name: str
+    #: The names of virtual devices to test.
     vdevs: list[str] = Field(default_factory=list)
 
 
@@ -523,25 +445,23 @@ class TestRunConfiguration(BaseModel, frozen=True, extra="forbid"):
 
     The configuration contains testbed information, what tests to execute
     and with what DPDK build.
-
-    Attributes:
-        dpdk_config: The DPDK configuration used to test.
-        perf: Whether to run performance tests.
-        func: Whether to run functional tests.
-        skip_smoke_tests: Whether to skip smoke tests.
-        test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node configuration to use in this test run.
-        traffic_generator_node: The TG node name to use in this test run.
-        random_seed: The seed to use for pseudo-random generation.
     """
 
+    #: The DPDK configuration used to test.
     dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
-    perf: bool = Field(description="Enable performance testing.")
-    func: bool = Field(description="Enable functional testing.")
+    #: Whether to run performance tests.
+    perf: bool
+    #: Whether to run functional tests.
+    func: bool
+    #: Whether to skip smoke tests.
     skip_smoke_tests: bool = False
+    #: The names of test suites and/or test cases to execute.
     test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    #: The SUT node configuration to use in this test run.
     system_under_test_node: TestRunSUTNodeConfiguration
+    #: The TG node name to use in this test run.
     traffic_generator_node: str
+    #: The seed to use for pseudo-random generation.
     random_seed: int | None = None
 
 
@@ -557,14 +477,11 @@ class TestRunWithNodesConfiguration(NamedTuple):
 
 
 class Configuration(BaseModel, extra="forbid"):
-    """DTS testbed and test configuration.
-
-    Attributes:
-        test_runs: Test run configurations.
-        nodes: Node configurations.
-    """
+    """DTS testbed and test configuration."""
 
+    #: Test run configurations.
     test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    #: Node configurations.
     nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
     @cached_property
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v5 8/8] dts: use TestSuiteSpec class imports
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (6 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 7/8] dts: improve configuration API docs Luca Vizzarro
@ 2024-11-06 18:09   ` Luca Vizzarro
  2024-11-07  0:34   ` [PATCH v5 0/8] dts: Pydantic configuration Patrick Robb
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-06 18:09 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are also automatically imported. This causes double imports, as
the runner loads the test suites again. This change makes the runner
load the already-imported classes from TestSuiteSpec instead of
importing them again.
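
Roughly, the resulting lookup in the runner (a simplified sketch; the
attribute names match the diff below):

    for test_suite_config in test_suite_configs:
        # The class was already imported during test suite discovery,
        # so reuse it instead of importing the module a second time.
        test_suite_class = test_suite_config.test_suite_spec.class_obj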

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
---
 dts/framework/runner.py | 84 ++++-------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases
 
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
     def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v5 4/8] dts: use pydantic in the configuration
  2024-11-06 18:09   ` [PATCH v5 4/8] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-11-07  0:33     ` Patrick Robb
  0 siblings, 0 replies; 83+ messages in thread
From: Patrick Robb @ 2024-11-07  0:33 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek, Nicholas Pratte

[-- Attachment #1: Type: text/plain, Size: 98 bytes --]

The added config validation here is great, thanks.

Reviewed-by: Patrick Robb <probb@iol.unh.edu>


^ permalink raw reply	[flat|nested] 83+ messages in thread

* Re: [PATCH v5 0/8] dts: Pydantic configuration
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
                     ` (7 preceding siblings ...)
  2024-11-06 18:09   ` [PATCH v5 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
@ 2024-11-07  0:34   ` Patrick Robb
  8 siblings, 0 replies; 83+ messages in thread
From: Patrick Robb @ 2024-11-07  0:34 UTC (permalink / raw)
  To: Luca Vizzarro; +Cc: dev, Paul Szczepanek

[-- Attachment #1: Type: text/plain, Size: 53 bytes --]

Series-reviewed-by: Patrick Robb <probb@iol.unh.edu>


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 0/9] dts: Pydantic configuration
  2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
                   ` (8 preceding siblings ...)
  2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
@ 2024-11-08 11:39 ` Luca Vizzarro
  2024-11-08 11:39   ` [PATCH v6 1/9] dts: add pydantic dependency Luca Vizzarro
                     ` (8 more replies)
  9 siblings, 9 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:39 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

Hi there,

sending a v6 for the pydantic changes.

v6:
- rebased
- fixed API doc errors when building outside of the poetry shell
  - now re-using a `to_pascal_case` function instead of pydantic's,
    as the latter would prevent generating the docs correctly if
    pydantic is missing
v5:
- rebased
- fixed typos
- renamed NodeInfo to OSSessionInfo
- fixed bug on DPDKRemoteTarballConfiguration object
v4:
- added autodoc_pydantic due to autodoc warnings
- fixed pydantic models docstrings
- updated docs
- refactored DPDKBuildInfo and NodeInfo which didn't belong in
  configuration
v3:
- removed the common FrozenModel and configured each BaseModel
  individually, due to mypy complaints
v2:
- rebased and merge conflicts resolved:
  - capabilities patch introducing TestCase has now been combined with
    TestSuiteSpec
  - external build patch added more configuration complexity, which has
    been re-worked in pydantic by adding exclusion via structured models
- split pydantic/warlock dependency chains
- deleted the config schema as no longer needed
- removed config schema generator
- turned all configuration dataclasses into Pydantic BaseModels
- refactored
- improved docstrings

Best,
Luca

Luca Vizzarro (9):
  dts: add pydantic dependency
  dts: add TestSuiteSpec class and discovery
  dts: refactor build and node info classes
  dts: use pydantic in the configuration
  dts: remove warlock dependency
  dts: add autodoc pydantic
  dts: improve configuration API docs
  dts: fix custom enum behaviour with docs
  dts: use TestSuiteSpec class imports

 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 doc/guides/conf.py                            |  13 +
 doc/guides/tools/dts.rst                      | 192 +---
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 860 ++++++++----------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ---
 dts/framework/remote_session/testpmd_shell.py |   3 +-
 dts/framework/runner.py                       | 139 +--
 dts/framework/settings.py                     | 124 +--
 dts/framework/test_result.py                  |   6 +-
 dts/framework/test_suite.py                   | 190 +++-
 dts/framework/testbed_model/capability.py     |  12 +-
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |  27 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |  12 +-
 dts/framework/testbed_model/sut_node.py       | 204 +++--
 dts/framework/testbed_model/topology.py       |  14 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   7 +-
 dts/poetry.lock                               | 423 +++++----
 dts/pyproject.toml                            |   3 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 27 files changed, 1090 insertions(+), 1800 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 1/9] dts: add pydantic dependency
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
@ 2024-11-08 11:39   ` Luca Vizzarro
  2024-11-08 11:39   ` [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery Luca Vizzarro
                     ` (7 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:39 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

As part of configuration validation and deserialization improvements,
this adds pydantic as a project dependency. Pydantic is a library that
caters to all of the aforementioned needs, while improving the process
and code.
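
As a minimal, hypothetical sketch of what pydantic brings (this model
is illustrative only and not part of the series):

    from pydantic import BaseModel, ValidationError

    class NodeExample(BaseModel, frozen=True, extra="forbid"):
        hostname: str
        port: int = 22

    try:
        NodeExample(hostname="sut1", port="not-a-number")
    except ValidationError as e:
        print(e)  # names the offending field with a clear message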

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 dts/poetry.lock    | 171 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   1 +
 2 files changed, 170 insertions(+), 2 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index cf5f6569c6..56c50ad52c 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,4 +1,4 @@
-# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+# This file is automatically @generated by Poetry 1.8.3 and should not be changed by hand.
 
 [[package]]
 name = "aenum"
@@ -23,6 +23,17 @@ files = [
     {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
 ]
 
+[[package]]
+name = "annotated-types"
+version = "0.7.0"
+description = "Reusable constraint types to use with typing.Annotated"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"},
+    {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -567,6 +578,16 @@ files = [
     {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
     {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:f698de3fd0c4e6972b92290a45bd9b1536bffe8c6759c62471efaa8acb4c37bc"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:aa57bd9cf8ae831a362185ee444e15a93ecb2e344c8e52e4d721ea3ab6ef1823"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ffcc3f7c66b5f5b7931a5aa68fc9cecc51e685ef90282f4a82f0f5e9b704ad11"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:47d4f1c5f80fc62fdd7777d0d40a2e9dda0a05883ab11374334f6c4de38adffd"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1f67c7038d560d92149c060157d623c542173016c4babc0c1913cca0564b9939"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:9aad3c1755095ce347e26488214ef77e0485a3c34a50c5a5e2471dff60b9dd9c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:14ff806850827afd6b07a5f32bd917fb7f45b046ba40c57abdb636674a8b559c"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8f9293864fe09b8149f0cc42ce56e3f0e54de883a9de90cd427f191c346eb2e1"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win32.whl", hash = "sha256:715d3562f79d540f251b99ebd6d8baa547118974341db04f5ad06d5ea3eb8007"},
+    {file = "MarkupSafe-2.1.3-cp312-cp312-win_amd64.whl", hash = "sha256:1b8dd8c3fd14349433c79fa8abeb573a55fc0fdd769133baac1f5e07abf54aeb"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
     {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
@@ -762,6 +783,130 @@ files = [
     {file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
 ]
 
+[[package]]
+name = "pydantic"
+version = "2.9.2"
+description = "Data validation using Python type hints"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic-2.9.2-py3-none-any.whl", hash = "sha256:f048cec7b26778210e28a0459867920654d48e5e62db0958433636cde4254f12"},
+    {file = "pydantic-2.9.2.tar.gz", hash = "sha256:d155cef71265d1e9807ed1c32b4c8deec042a44a50a4188b25ac67ecd81a9c0f"},
+]
+
+[package.dependencies]
+annotated-types = ">=0.6.0"
+pydantic-core = "2.23.4"
+typing-extensions = [
+    {version = ">=4.12.2", markers = "python_version >= \"3.13\""},
+    {version = ">=4.6.1", markers = "python_version < \"3.13\""},
+]
+
+[package.extras]
+email = ["email-validator (>=2.0.0)"]
+timezone = ["tzdata"]
+
+[[package]]
+name = "pydantic-core"
+version = "2.23.4"
+description = "Core functionality for Pydantic validation and serialization"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:b10bd51f823d891193d4717448fab065733958bdb6a6b351967bd349d48d5c9b"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:4fc714bdbfb534f94034efaa6eadd74e5b93c8fa6315565a222f7b6f42ca1166"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63e46b3169866bd62849936de036f901a9356e36376079b05efa83caeaa02ceb"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed1a53de42fbe34853ba90513cea21673481cd81ed1be739f7f2efb931b24916"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cfdd16ab5e59fc31b5e906d1a3f666571abc367598e3e02c83403acabc092e07"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:255a8ef062cbf6674450e668482456abac99a5583bbafb73f9ad469540a3a232"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4a7cd62e831afe623fbb7aabbb4fe583212115b3ef38a9f6b71869ba644624a2"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f09e2ff1f17c2b51f2bc76d1cc33da96298f0a036a137f5440ab3ec5360b624f"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:e38e63e6f3d1cec5a27e0afe90a085af8b6806ee208b33030e65b6516353f1a3"},
+    {file = "pydantic_core-2.23.4-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0dbd8dbed2085ed23b5c04afa29d8fd2771674223135dc9bc937f3c09284d071"},
+    {file = "pydantic_core-2.23.4-cp310-none-win32.whl", hash = "sha256:6531b7ca5f951d663c339002e91aaebda765ec7d61b7d1e3991051906ddde119"},
+    {file = "pydantic_core-2.23.4-cp310-none-win_amd64.whl", hash = "sha256:7c9129eb40958b3d4500fa2467e6a83356b3b61bfff1b414c7361d9220f9ae8f"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:77733e3892bb0a7fa797826361ce8a9184d25c8dffaec60b7ffe928153680ba8"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1b84d168f6c48fabd1f2027a3d1bdfe62f92cade1fb273a5d68e621da0e44e6d"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df49e7a0861a8c36d089c1ed57d308623d60416dab2647a4a17fe050ba85de0e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ff02b6d461a6de369f07ec15e465a88895f3223eb75073ffea56b84d9331f607"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:996a38a83508c54c78a5f41456b0103c30508fed9abcad0a59b876d7398f25fd"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d97683ddee4723ae8c95d1eddac7c192e8c552da0c73a925a89fa8649bf13eea"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:216f9b2d7713eb98cb83c80b9c794de1f6b7e3145eef40400c62e86cee5f4e1e"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6f783e0ec4803c787bcea93e13e9932edab72068f68ecffdf86a99fd5918878b"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:d0776dea117cf5272382634bd2a5c1b6eb16767c223c6a5317cd3e2a757c61a0"},
+    {file = "pydantic_core-2.23.4-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d5f7a395a8cf1621939692dba2a6b6a830efa6b3cee787d82c7de1ad2930de64"},
+    {file = "pydantic_core-2.23.4-cp311-none-win32.whl", hash = "sha256:74b9127ffea03643e998e0c5ad9bd3811d3dac8c676e47db17b0ee7c3c3bf35f"},
+    {file = "pydantic_core-2.23.4-cp311-none-win_amd64.whl", hash = "sha256:98d134c954828488b153d88ba1f34e14259284f256180ce659e8d83e9c05eaa3"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f3e0da4ebaef65158d4dfd7d3678aad692f7666877df0002b8a522cdf088f231"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f69a8e0b033b747bb3e36a44e7732f0c99f7edd5cea723d45bc0d6e95377ffee"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:723314c1d51722ab28bfcd5240d858512ffd3116449c557a1336cbe3919beb87"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb2802e667b7051a1bebbfe93684841cc9351004e2badbd6411bf357ab8d5ac8"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d18ca8148bebe1b0a382a27a8ee60350091a6ddaf475fa05ef50dc35b5df6327"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:33e3d65a85a2a4a0dc3b092b938a4062b1a05f3a9abde65ea93b233bca0e03f2"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:128585782e5bfa515c590ccee4b727fb76925dd04a98864182b22e89a4e6ed36"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68665f4c17edcceecc112dfed5dbe6f92261fb9d6054b47d01bf6371a6196126"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:20152074317d9bed6b7a95ade3b7d6054845d70584216160860425f4fbd5ee9e"},
+    {file = "pydantic_core-2.23.4-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:9261d3ce84fa1d38ed649c3638feefeae23d32ba9182963e465d58d62203bd24"},
+    {file = "pydantic_core-2.23.4-cp312-none-win32.whl", hash = "sha256:4ba762ed58e8d68657fc1281e9bb72e1c3e79cc5d464be146e260c541ec12d84"},
+    {file = "pydantic_core-2.23.4-cp312-none-win_amd64.whl", hash = "sha256:97df63000f4fea395b2824da80e169731088656d1818a11b95f3b173747b6cd9"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:7530e201d10d7d14abce4fb54cfe5b94a0aefc87da539d0346a484ead376c3cc"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:df933278128ea1cd77772673c73954e53a1c95a4fdf41eef97c2b779271bd0bd"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cb3da3fd1b6a5d0279a01877713dbda118a2a4fc6f0d821a57da2e464793f05"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:42c6dcb030aefb668a2b7009c85b27f90e51e6a3b4d5c9bc4c57631292015b0d"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:696dd8d674d6ce621ab9d45b205df149399e4bb9aa34102c970b721554828510"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2971bb5ffe72cc0f555c13e19b23c85b654dd2a8f7ab493c262071377bfce9f6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8394d940e5d400d04cad4f75c0598665cbb81aecefaca82ca85bd28264af7f9b"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0dff76e0602ca7d4cdaacc1ac4c005e0ce0dcfe095d5b5259163a80d3a10d327"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:7d32706badfe136888bdea71c0def994644e09fff0bfe47441deaed8e96fdbc6"},
+    {file = "pydantic_core-2.23.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ed541d70698978a20eb63d8c5d72f2cc6d7079d9d90f6b50bad07826f1320f5f"},
+    {file = "pydantic_core-2.23.4-cp313-none-win32.whl", hash = "sha256:3d5639516376dce1940ea36edf408c554475369f5da2abd45d44621cb616f769"},
+    {file = "pydantic_core-2.23.4-cp313-none-win_amd64.whl", hash = "sha256:5a1504ad17ba4210df3a045132a7baeeba5a200e930f57512ee02909fc5c4cb5"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:d4488a93b071c04dc20f5cecc3631fc78b9789dd72483ba15d423b5b3689b555"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:81965a16b675b35e1d09dd14df53f190f9129c0202356ed44ab2728b1c905658"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4ffa2ebd4c8530079140dd2d7f794a9d9a73cbb8e9d59ffe24c63436efa8f271"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:61817945f2fe7d166e75fbfb28004034b48e44878177fc54d81688e7b85a3665"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:29d2c342c4bc01b88402d60189f3df065fb0dda3654744d5a165a5288a657368"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5e11661ce0fd30a6790e8bcdf263b9ec5988e95e63cf901972107efc49218b13"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9d18368b137c6295db49ce7218b1a9ba15c5bc254c96d7c9f9e924a9bc7825ad"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ec4e55f79b1c4ffb2eecd8a0cfba9955a2588497d96851f4c8f99aa4a1d39b12"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:374a5e5049eda9e0a44c696c7ade3ff355f06b1fe0bb945ea3cac2bc336478a2"},
+    {file = "pydantic_core-2.23.4-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5c364564d17da23db1106787675fc7af45f2f7b58b4173bfdd105564e132e6fb"},
+    {file = "pydantic_core-2.23.4-cp38-none-win32.whl", hash = "sha256:d7a80d21d613eec45e3d41eb22f8f94ddc758a6c4720842dc74c0581f54993d6"},
+    {file = "pydantic_core-2.23.4-cp38-none-win_amd64.whl", hash = "sha256:5f5ff8d839f4566a474a969508fe1c5e59c31c80d9e140566f9a37bba7b8d556"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a4fa4fc04dff799089689f4fd502ce7d59de529fc2f40a2c8836886c03e0175a"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7df63886be5e270da67e0966cf4afbae86069501d35c8c1b3b6c168f42cb36"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dcedcd19a557e182628afa1d553c3895a9f825b936415d0dbd3cd0bbcfd29b4b"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f54b118ce5de9ac21c363d9b3caa6c800341e8c47a508787e5868c6b79c9323"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:86d2f57d3e1379a9525c5ab067b27dbb8a0642fb5d454e17a9ac434f9ce523e3"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:de6d1d1b9e5101508cb37ab0d972357cac5235f5c6533d1071964c47139257df"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1278e0d324f6908e872730c9102b0112477a7f7cf88b308e4fc36ce1bdb6d58c"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9a6b5099eeec78827553827f4c6b8615978bb4b6a88e5d9b93eddf8bb6790f55"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:e55541f756f9b3ee346b840103f32779c695a19826a4c442b7954550a0972040"},
+    {file = "pydantic_core-2.23.4-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a5c7ba8ffb6d6f8f2ab08743be203654bb1aaa8c9dcb09f82ddd34eadb695605"},
+    {file = "pydantic_core-2.23.4-cp39-none-win32.whl", hash = "sha256:37b0fe330e4a58d3c58b24d91d1eb102aeec675a3db4c292ec3928ecd892a9a6"},
+    {file = "pydantic_core-2.23.4-cp39-none-win_amd64.whl", hash = "sha256:1498bec4c05c9c787bde9125cfdcc63a41004ff167f495063191b863399b1a29"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:f455ee30a9d61d3e1a15abd5068827773d6e4dc513e795f380cdd59932c782d5"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:1e90d2e3bd2c3863d48525d297cd143fe541be8bbf6f579504b9712cb6b643ec"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2e203fdf807ac7e12ab59ca2bfcabb38c7cf0b33c41efeb00f8e5da1d86af480"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e08277a400de01bc72436a0ccd02bdf596631411f592ad985dcee21445bd0068"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f220b0eea5965dec25480b6333c788fb72ce5f9129e8759ef876a1d805d00801"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:d06b0c8da4f16d1d1e352134427cb194a0a6e19ad5db9161bf32b2113409e728"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ba1a0996f6c2773bd83e63f18914c1de3c9dd26d55f4ac302a7efe93fb8e7433"},
+    {file = "pydantic_core-2.23.4-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:9a5bce9d23aac8f0cf0836ecfc033896aa8443b501c58d0602dbfd5bd5b37753"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:78ddaaa81421a29574a682b3179d4cf9e6d405a09b99d93ddcf7e5239c742e21"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:883a91b5dd7d26492ff2f04f40fbb652de40fcc0afe07e8129e8ae779c2110eb"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:88ad334a15b32a791ea935af224b9de1bf99bcd62fabf745d5f3442199d86d59"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:233710f069d251feb12a56da21e14cca67994eab08362207785cf8c598e74577"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:19442362866a753485ba5e4be408964644dd6a09123d9416c54cd49171f50744"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:624e278a7d29b6445e4e813af92af37820fafb6dcc55c012c834f9e26f9aaaef"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f5ef8f42bec47f21d07668a043f077d507e5bf4e668d5c6dfe6aaba89de1a5b8"},
+    {file = "pydantic_core-2.23.4-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:aea443fffa9fbe3af1a9ba721a87f926fe548d32cab71d188a6ede77d0ff244e"},
+    {file = "pydantic_core-2.23.4.tar.gz", hash = "sha256:2584f7cf844ac4d970fba483a717dbe10c1c1c96a969bf65d61ffe94df1b2863"},
+]
+
+[package.dependencies]
+typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -880,6 +1025,7 @@ files = [
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
     {file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
+    {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
     {file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
     {file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
     {file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -887,8 +1033,16 @@ files = [
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
     {file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
+    {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
     {file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
     {file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
+    {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
+    {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
+    {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
+    {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
     {file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
     {file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -905,6 +1059,7 @@ files = [
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
     {file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
+    {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
     {file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
     {file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
     {file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -912,6 +1067,7 @@ files = [
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
     {file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
+    {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
     {file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
     {file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -1327,6 +1483,17 @@ files = [
     {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
 ]
 
+[[package]]
+name = "typing-extensions"
+version = "4.12.2"
+description = "Backported and Experimental Type Hints for Python 3.8+"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"},
+    {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"},
+]
+
 [[package]]
 name = "urllib3"
 version = "2.0.7"
@@ -1362,4 +1529,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f20ce05310df93fed1d392160d1653ae5de5c6f260a5865eb3c6111a7c2b394"
+content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 506380ac2f..6c2d1ca8a4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -28,6 +28,7 @@ scapy = "^2.5.0"
 pydocstyle = "6.1.1"
 typing-extensions = "^4.11.0"
 aenum = "^3.1.15"
+pydantic = "^2.9.2"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^1.10.0"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
  2024-11-08 11:39   ` [PATCH v6 1/9] dts: add pydantic dependency Luca Vizzarro
@ 2024-11-08 11:39   ` Luca Vizzarro
  2024-11-20  8:48     ` Ali Alnubani
  2024-11-08 11:39   ` [PATCH v6 3/9] dts: refactor build and node info classes Luca Vizzarro
                     ` (6 subsequent siblings)
  8 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:39 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Currently there is no definition that identifies all the test
suites available for testing. This change simplifies the process of
discovering and identifying all the test suites.
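
To illustrate the intended use, here is a rough sketch based on the
names introduced below (the lookup name "hello_world" is just an
example of an existing test suite module):

    from framework.test_suite import AVAILABLE_TEST_SUITES, find_by_name

    # Every module named TestSuite_* under the "tests" package,
    # discovered and imported exactly once.
    for spec in AVAILABLE_TEST_SUITES:
        print(spec.module_name, "->", spec.class_name)

    # Look up a suite by its short name: "hello_world" maps to the
    # module TestSuite_hello_world and the class TestHelloWorld.
    spec = find_by_name("hello_world")
    if spec is not None:
        suite_class = spec.class_obj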

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 dts/framework/runner.py                   |   2 +-
 dts/framework/test_suite.py               | 190 +++++++++++++++++++---
 dts/framework/testbed_model/capability.py |  12 +-
 dts/framework/utils.py                    |   5 +
 4 files changed, 182 insertions(+), 27 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 8bbe698eaf..195622c653 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -225,7 +225,7 @@ def _get_test_suites_with_cases(
         for test_suite_config in test_suite_configs:
             test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
             test_cases: list[type[TestCase]] = []
-            func_test_cases, perf_test_cases = test_suite_class.get_test_cases(
+            func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases
             )
             if func:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index cbe3b30ffc..fb5d646ce3 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2010-2014 Intel Corporation
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Features common to all test suites.
 
@@ -16,13 +17,19 @@
 import inspect
 from collections import Counter
 from collections.abc import Callable, Sequence
+from dataclasses import dataclass
 from enum import Enum, auto
+from functools import cached_property
+from importlib import import_module
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
+from pkgutil import iter_modules
+from types import ModuleType
 from typing import ClassVar, Protocol, TypeVar, Union, cast
 
 from scapy.layers.inet import IP  # type: ignore[import-untyped]
 from scapy.layers.l2 import Ether  # type: ignore[import-untyped]
 from scapy.packet import Packet, Padding, raw  # type: ignore[import-untyped]
+from typing_extensions import Self
 
 from framework.testbed_model.capability import TestProtocol
 from framework.testbed_model.port import Port
@@ -33,9 +40,9 @@
     PacketFilteringConfig,
 )
 
-from .exception import ConfigurationError, TestCaseVerifyError
+from .exception import ConfigurationError, InternalError, TestCaseVerifyError
 from .logger import DTSLogger, get_dts_logger
-from .utils import get_packet_summaries
+from .utils import get_packet_summaries, to_pascal_case
 
 
 class TestSuite(TestProtocol):
@@ -112,10 +119,24 @@ def __init__(
         self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
 
     @classmethod
-    def get_test_cases(
+    def get_test_cases(cls) -> list[type["TestCase"]]:
+        """A list of all the available test cases."""
+
+        def is_test_case(function: Callable) -> bool:
+            if inspect.isfunction(function):
+                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
+                # But function.test_type exists.
+                if hasattr(function, "test_type"):
+                    return isinstance(function.test_type, TestCaseType)
+            return False
+
+        return [test_case for _, test_case in inspect.getmembers(cls, is_test_case)]
+
+    @classmethod
+    def filter_test_cases(
         cls, test_case_sublist: Sequence[str] | None = None
     ) -> tuple[set[type["TestCase"]], set[type["TestCase"]]]:
-        """Filter `test_case_subset` from this class.
+        """Filter `test_case_sublist` from this class.
 
         Test cases are regular (or bound) methods decorated with :func:`func_test`
         or :func:`perf_test`.
@@ -129,17 +150,8 @@ def get_test_cases(
             as methods are bound to instances and this method only has access to the class.
 
         Raises:
-            ConfigurationError: If a test case from `test_case_subset` is not found.
+            ConfigurationError: If a test case from `test_case_sublist` is not found.
         """
-
-        def is_test_case(function: Callable) -> bool:
-            if inspect.isfunction(function):
-                # TestCase is not used at runtime, so we can't use isinstance() with `function`.
-                # But function.test_type exists.
-                if hasattr(function, "test_type"):
-                    return isinstance(function.test_type, TestCaseType)
-            return False
-
         if test_case_sublist is None:
             test_case_sublist = []
 
@@ -149,22 +161,22 @@ def is_test_case(function: Callable) -> bool:
         func_test_cases = set()
         perf_test_cases = set()
 
-        for test_case_name, test_case_function in inspect.getmembers(cls, is_test_case):
-            if test_case_name in test_case_sublist_copy:
+        for test_case in cls.get_test_cases():
+            if test_case.name in test_case_sublist_copy:
                 # if test_case_sublist_copy is non-empty, remove the found test case
                 # so that we can look at the remainder at the end
-                test_case_sublist_copy.remove(test_case_name)
+                test_case_sublist_copy.remove(test_case.name)
             elif test_case_sublist:
                 # the original list not being empty means we're filtering test cases
-                # since we didn't remove test_case_name in the previous branch,
+                # since we didn't remove test_case.name in the previous branch,
                 # it doesn't match the filter and we don't want to remove it
                 continue
 
-            match test_case_function.test_type:
+            match test_case.test_type:
                 case TestCaseType.PERFORMANCE:
-                    perf_test_cases.add(test_case_function)
+                    perf_test_cases.add(test_case)
                 case TestCaseType.FUNCTIONAL:
-                    func_test_cases.add(test_case_function)
+                    func_test_cases.add(test_case)
 
         if test_case_sublist_copy:
             raise ConfigurationError(
@@ -536,6 +548,8 @@ class TestCase(TestProtocol, Protocol[TestSuiteMethodType]):
     test case function to :class:`TestCase` and sets common variables.
     """
 
+    #:
+    name: ClassVar[str]
     #:
     test_type: ClassVar[TestCaseType]
     #: necessary for mypy so that it can treat this class as the function it's shadowing
@@ -560,6 +574,7 @@ def make_decorator(
 
         def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
             test_case = cast(type[TestCase], func)
+            test_case.name = func.__name__
             test_case.skip = cls.skip
             test_case.skip_reason = cls.skip_reason
             test_case.required_capabilities = set()
@@ -575,3 +590,136 @@ def _decorator(func: TestSuiteMethodType) -> type[TestCase]:
 func_test: Callable = TestCase.make_decorator(TestCaseType.FUNCTIONAL)
 #: The decorator for performance test cases.
 perf_test: Callable = TestCase.make_decorator(TestCaseType.PERFORMANCE)
+
+
+@dataclass
+class TestSuiteSpec:
+    """A class defining the specification of a test suite.
+
+    Apart from defining all the specs of a test suite, a helper function :meth:`discover_all` is
+    provided to automatically discover all the available test suites.
+
+    Attributes:
+        module_name: The name of the test suite's module.
+    """
+
+    #:
+    TEST_SUITES_PACKAGE_NAME = "tests"
+    #:
+    TEST_SUITE_MODULE_PREFIX = "TestSuite_"
+    #:
+    TEST_SUITE_CLASS_PREFIX = "Test"
+    #:
+    TEST_CASE_METHOD_PREFIX = "test_"
+    #:
+    FUNC_TEST_CASE_REGEX = r"test_(?!perf_)"
+    #:
+    PERF_TEST_CASE_REGEX = r"test_perf_"
+
+    module_name: str
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the test suite's module."""
+        return self.module_name[len(self.TEST_SUITE_MODULE_PREFIX) :]
+
+    @cached_property
+    def module(self) -> ModuleType:
+        """A reference to the test suite's module."""
+        return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
+
+    @cached_property
+    def class_name(self) -> str:
+        """The name of the test suite's class."""
+        return f"{self.TEST_SUITE_CLASS_PREFIX}{to_pascal_case(self.name)}"
+
+    @cached_property
+    def class_obj(self) -> type[TestSuite]:
+        """A reference to the test suite's class."""
+
+        def is_test_suite(obj) -> bool:
+            """Check whether `obj` is a :class:`TestSuite`.
+
+            The `obj` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
+
+            Args:
+                obj: The object to be checked.
+
+            Returns:
+                :data:`True` if `obj` is a subclass of `TestSuite`.
+            """
+            try:
+                if issubclass(obj, TestSuite) and obj is not TestSuite:
+                    return True
+            except TypeError:
+                return False
+            return False
+
+        for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
+            if class_name == self.class_name:
+                return class_obj
+
+        raise InternalError(
+            f"Expected class {self.class_name} not found in module {self.module_name}."
+        )
+
+    @classmethod
+    def discover_all(
+        cls, package_name: str | None = None, module_prefix: str | None = None
+    ) -> list[Self]:
+        """Discover all the test suites.
+
+        The test suites are discovered in the provided `package_name`. The full module name,
+        expected under that package, is prefixed with `module_prefix`.
+        The module name is a standard filename with words separated with underscores.
+        For each module found, search for a :class:`TestSuite` class which starts
+        with :attr:`~TestSuiteSpec.TEST_SUITE_CLASS_PREFIX`, continuing with the module name in
+        PascalCase.
+
+        The PascalCase convention applies to abbreviations, acronyms, initialisms and so on::
+
+            OS -> Os
+            TCP -> Tcp
+
+        Args:
+            package_name: The name of the package where to find the test suites. If :data:`None`,
+                the :attr:`~TestSuiteSpec.TEST_SUITES_PACKAGE_NAME` is used.
+            module_prefix: The name prefix defining the test suite module. If :data:`None`, the
+                :attr:`~TestSuiteSpec.TEST_SUITE_MODULE_PREFIX` constant is used.
+
+        Returns:
+            A list containing all the discovered test suites.
+        """
+        if package_name is None:
+            package_name = cls.TEST_SUITES_PACKAGE_NAME
+        if module_prefix is None:
+            module_prefix = cls.TEST_SUITE_MODULE_PREFIX
+
+        test_suites = []
+
+        test_suites_pkg = import_module(package_name)
+        for _, module_name, is_pkg in iter_modules(test_suites_pkg.__path__):
+            if not module_name.startswith(module_prefix) or is_pkg:
+                continue
+
+            test_suite = cls(module_name)
+            try:
+                if test_suite.class_obj:
+                    test_suites.append(test_suite)
+            except InternalError as err:
+                get_dts_logger().warning(err)
+
+        return test_suites
+
+
+AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
+"""Constant to store all the available, discovered and imported test suites.
+
+The test suites should be gathered from this list to avoid importing more than once.
+"""
+
+
+def find_by_name(name: str) -> TestSuiteSpec | None:
+    """Find a requested test suite by name from the available ones."""
+    test_suites = filter(lambda t: t.name == name, AVAILABLE_TEST_SUITES)
+    return next(test_suites, None)
diff --git a/dts/framework/testbed_model/capability.py b/dts/framework/testbed_model/capability.py
index 2207957a7a..0d5f0e0b32 100644
--- a/dts/framework/testbed_model/capability.py
+++ b/dts/framework/testbed_model/capability.py
@@ -47,9 +47,9 @@ def test_scatter_mbuf_2048(self):
 
 import inspect
 from abc import ABC, abstractmethod
-from collections.abc import MutableSet, Sequence
+from collections.abc import MutableSet
 from dataclasses import dataclass
-from typing import Callable, ClassVar, Protocol
+from typing import TYPE_CHECKING, Callable, ClassVar, Protocol
 
 from typing_extensions import Self
 
@@ -66,6 +66,9 @@ def test_scatter_mbuf_2048(self):
 from .sut_node import SutNode
 from .topology import Topology, TopologyType
 
+if TYPE_CHECKING:
+    from framework.test_suite import TestCase
+
 
 class Capability(ABC):
     """The base class for various capabilities.
@@ -354,8 +357,7 @@ def set_required(self, test_case_or_suite: type["TestProtocol"]) -> None:
         if inspect.isclass(test_case_or_suite):
             if self.topology_type is not TopologyType.default:
                 self.add_to_required(test_case_or_suite)
-                func_test_cases, perf_test_cases = test_case_or_suite.get_test_cases()
-                for test_case in func_test_cases | perf_test_cases:
+                for test_case in test_case_or_suite.get_test_cases():
                     if test_case.topology_type.topology_type is TopologyType.default:
                         # test case topology has not been set, use the one set by the test suite
                         self.add_to_required(test_case)
@@ -446,7 +448,7 @@ class TestProtocol(Protocol):
     required_capabilities: ClassVar[set[Capability]] = set()
 
     @classmethod
-    def get_test_cases(cls, test_case_sublist: Sequence[str] | None = None) -> tuple[set, set]:
+    def get_test_cases(cls) -> list[type["TestCase"]]:
         """Get test cases. Should be implemented by subclasses containing test cases.
 
         Raises:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 78a39e32c7..43e2592fce 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -303,3 +303,8 @@ class MultiInheritanceBaseClass:
     def __init__(self, *args, **kwargs) -> None:
         """Call the init method of :class:`object`."""
         super().__init__()
+
+
+def to_pascal_case(text: str) -> str:
+    """Convert `text` from snake_case to PascalCase."""
+    return "".join([seg.capitalize() for seg in text.split("_")])
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 3/9] dts: refactor build and node info classes
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
  2024-11-08 11:39   ` [PATCH v6 1/9] dts: add pydantic dependency Luca Vizzarro
  2024-11-08 11:39   ` [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-11-08 11:39   ` Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 4/9] dts: use pydantic in the configuration Luca Vizzarro
                     ` (5 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:39 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

The DPDKBuildInfo and NodeInfo classes, representing information
gathered at runtime, were erroneously placed in the configuration
package. This change moves them into more appropriate modules.

NodeInfo, specifically, is moved to os_session instead of node, mostly
as a consequence of circular dependencies. Given that os_session is
the top-most module to reference it, it is the most suitable place
outside of node.

Finally, NodeInfo is renamed to OSSessionInfo, as it better represents
information about the target OS session.
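
A short sketch of the resulting import locations (the field values
here are made up for illustration):

    from framework.testbed_model.os_session import OSSessionInfo
    from framework.testbed_model.sut_node import DPDKBuildInfo

    # Both are frozen dataclasses, populated from runtime queries.
    session_info = OSSessionInfo(
        os_name="Ubuntu", os_version="22.04", kernel_version="5.15.0"
    )
    build_info = DPDKBuildInfo(
        dpdk_version="24.03.0", compiler_version="gcc 11.4.0"
    )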

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 dts/framework/config/__init__.py             | 31 --------------------
 dts/framework/test_result.py                 |  6 ++--
 dts/framework/testbed_model/os_session.py    | 23 +++++++++++++--
 dts/framework/testbed_model/posix_session.py |  8 ++---
 dts/framework/testbed_model/sut_node.py      | 22 ++++++++++----
 5 files changed, 46 insertions(+), 44 deletions(-)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d0d95d00c7..7403ccbf14 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -318,24 +318,6 @@ class TGNodeConfiguration(NodeConfiguration):
     traffic_generator: TrafficGeneratorConfig
 
 
-@dataclass(slots=True, frozen=True)
-class NodeInfo:
-    """Supplemental node information.
-
-    Attributes:
-        os_name: The name of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        os_version: The version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-        kernel_version: The kernel version of the running operating system of
-            the :class:`~framework.testbed_model.node.Node`.
-    """
-
-    os_name: str
-    os_version: str
-    kernel_version: str
-
-
 @dataclass(slots=True, frozen=True)
 class DPDKBuildConfiguration:
     """DPDK build configuration.
@@ -493,19 +475,6 @@ def from_dict(cls, d: DPDKConfigurationDict) -> Self:
         )
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildInfo:
-    """Various versions and other information about a DPDK build.
-
-    Attributes:
-        dpdk_version: The DPDK version that was built.
-        compiler_version: The version of the compiler used to build DPDK.
-    """
-
-    dpdk_version: str | None
-    compiler_version: str | None
-
-
 @dataclass(slots=True, frozen=True)
 class TestSuiteConfig:
     """Test suite configuration.
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 00263ad69e..6014d281b5 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -30,11 +30,13 @@
 
 from framework.testbed_model.capability import Capability
 
-from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig
+from .config import TestRunConfiguration, TestSuiteConfig
 from .exception import DTSError, ErrorSeverity
 from .logger import DTSLogger
 from .settings import SETTINGS
 from .test_suite import TestCase, TestSuite
+from .testbed_model.os_session import OSSessionInfo
+from .testbed_model.sut_node import DPDKBuildInfo
 
 
 @dataclass(slots=True, frozen=True)
@@ -421,7 +423,7 @@ def test_suites_with_cases(self, test_suites_with_cases: list[TestSuiteWithCases
             )
         self._test_suites_with_cases = test_suites_with_cases
 
-    def add_sut_info(self, sut_info: NodeInfo) -> None:
+    def add_sut_info(self, sut_info: OSSessionInfo) -> None:
         """Add SUT information gathered at runtime.
 
         Args:
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 6194ddb989..db37424954 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -24,11 +24,12 @@
 """
 from abc import ABC, abstractmethod
 from collections.abc import Iterable
+from dataclasses import dataclass
 from ipaddress import IPv4Interface, IPv6Interface
 from pathlib import Path, PurePath, PurePosixPath
 from typing import Union
 
-from framework.config import Architecture, NodeConfiguration, NodeInfo
+from framework.config import Architecture, NodeConfiguration
 from framework.logger import DTSLogger
 from framework.remote_session import (
     InteractiveRemoteSession,
@@ -44,6 +45,24 @@
 from .port import Port
 
 
+@dataclass(slots=True, frozen=True)
+class OSSessionInfo:
+    """Supplemental OS session information.
+
+    Attributes:
+        os_name: The name of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        os_version: The version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+        kernel_version: The kernel version of the running operating system of
+            the :class:`~framework.testbed_model.node.Node`.
+    """
+
+    os_name: str
+    os_version: str
+    kernel_version: str
+
+
 class OSSession(ABC):
     """OS-unaware to OS-aware translation API definition.
 
@@ -482,7 +501,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
         """
 
     @abstractmethod
-    def get_node_info(self) -> NodeInfo:
+    def get_node_info(self) -> OSSessionInfo:
         """Collect additional information about the node.
 
         Returns:
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5ab7c18fb7..d7a1f38cad 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -15,7 +15,7 @@
 from collections.abc import Iterable
 from pathlib import Path, PurePath, PurePosixPath
 
-from framework.config import Architecture, NodeInfo
+from framework.config import Architecture
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
 from framework.settings import SETTINGS
 from framework.utils import (
@@ -26,7 +26,7 @@
     extract_tarball,
 )
 
-from .os_session import OSSession
+from .os_session import OSSession, OSSessionInfo
 
 
 class PosixSession(OSSession):
@@ -386,11 +386,11 @@ def get_compiler_version(self, compiler_name: str) -> str:
             case _:
                 raise ValueError(f"Unknown compiler {compiler_name}")
 
-    def get_node_info(self) -> NodeInfo:
+    def get_node_info(self) -> OSSessionInfo:
         """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
         os_release_info = self.send_command(
             "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
             SETTINGS.timeout,
         ).stdout.split("\n")
         kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
-        return NodeInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
+        return OSSessionInfo(os_release_info[0].strip(), os_release_info[1].strip(), kernel_version)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index e160386324..5474d436a1 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -14,13 +14,12 @@
 
 import os
 import time
+from dataclasses import dataclass
 from pathlib import PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKBuildInfo,
     DPDKLocation,
-    NodeInfo,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -30,10 +29,23 @@
 from framework.utils import MesonArgs, TarCompressionFormat
 
 from .node import Node
-from .os_session import OSSession
+from .os_session import OSSession, OSSessionInfo
 from .virtual_device import VirtualDevice
 
 
+@dataclass(slots=True, frozen=True)
+class DPDKBuildInfo:
+    """Various versions and other information about a DPDK build.
+
+    Attributes:
+        dpdk_version: The DPDK version that was built.
+        compiler_version: The version of the compiler used to build DPDK.
+    """
+
+    dpdk_version: str | None
+    compiler_version: str | None
+
+
 class SutNode(Node):
     """The system under test node.
 
@@ -63,7 +75,7 @@ class SutNode(Node):
     _app_compile_timeout: float
     _dpdk_kill_session: OSSession | None
     _dpdk_version: str | None
-    _node_info: NodeInfo | None
+    _node_info: OSSessionInfo | None
     _compiler_version: str | None
     _path_to_devbind_script: PurePath | None
     _ports_bound_to_dpdk: bool
@@ -125,7 +137,7 @@ def dpdk_version(self) -> str | None:
         return self._dpdk_version
 
     @property
-    def node_info(self) -> NodeInfo:
+    def node_info(self) -> OSSessionInfo:
         """Additional node information."""
         if self._node_info is None:
             self._node_info = self.main_session.get_node_info()
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 4/9] dts: use pydantic in the configuration
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (2 preceding siblings ...)
  2024-11-08 11:39   ` [PATCH v6 3/9] dts: refactor build and node info classes Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  2024-11-20  8:48     ` Ali Alnubani
  2024-11-08 11:40   ` [PATCH v6 5/9] dts: remove warlock dependency Luca Vizzarro
                     ` (4 subsequent siblings)
  8 siblings, 1 reply; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

This change brings in pydantic in place of warlock. Pydantic offers
a built-in model validation system in the classes, which allows for
more resilient and simpler code. As a consequence of this change:

- most validation is now built-in
- further validation is added to verify:
  - cross referencing of node names and ports
  - test suite and test cases names
- dictionaries representing the config schema are removed
- the config schema is no longer used and therefore dropped
- the TrafficGeneratorType enum has been changed from inheriting
  StrEnum to inheriting the native str and Enum. This change was
  necessary to enable the discriminator for object unions (see the
  sketch after this list)
- the structure of the classes has been slightly changed to perfectly
  match the structure of the configuration files
- the test suite argument catches the ValidationError that
  TestSuiteConfig can now raise
- the DPDK location has been wrapped under another configuration
  mapping `dpdk_location`
- the DPDK locations are now structured and enforced by classes,
  further simplifying the validation and handling thanks to
  pattern matching
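
For illustration, a minimal standalone sketch of the discriminated
union pattern mentioned above, mirroring the framework classes (the
TGNodeConfig container and the validated data are hypothetical):

    from enum import Enum
    from typing import Annotated, Literal

    from pydantic import BaseModel, ConfigDict, Field

    class FrozenModel(BaseModel):
        # Fields are read-only and extra fields are forbidden.
        model_config = ConfigDict(frozen=True, extra="forbid")

    class TrafficGeneratorType(str, Enum):
        SCAPY = "SCAPY"

    class TrafficGeneratorConfig(FrozenModel):
        type: TrafficGeneratorType

    class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
        type: Literal[TrafficGeneratorType.SCAPY]

    # Pydantic selects which union member to validate based on `type`.
    TrafficGeneratorConfigTypes = Annotated[
        ScapyTrafficGeneratorConfig, Field(discriminator="type")
    ]

    class TGNodeConfig(FrozenModel):
        traffic_generator: TrafficGeneratorConfigTypes

    node = TGNodeConfig.model_validate(
        {"traffic_generator": {"type": "SCAPY"}}
    )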

Bugzilla ID: 1508

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 doc/api/dts/conf_yaml_schema.json             |   1 -
 doc/api/dts/framework.config.rst              |   6 -
 doc/api/dts/framework.config.types.rst        |   8 -
 dts/conf.yaml                                 |  11 +-
 dts/framework/config/__init__.py              | 832 +++++++++---------
 dts/framework/config/conf_yaml_schema.json    | 459 ----------
 dts/framework/config/types.py                 | 149 ----
 dts/framework/runner.py                       |  57 +-
 dts/framework/settings.py                     | 124 +--
 dts/framework/testbed_model/node.py           |  15 +-
 dts/framework/testbed_model/os_session.py     |   4 +-
 dts/framework/testbed_model/port.py           |   4 +-
 dts/framework/testbed_model/posix_session.py  |   4 +-
 dts/framework/testbed_model/sut_node.py       | 182 ++--
 dts/framework/testbed_model/topology.py       |  11 +-
 .../traffic_generator/__init__.py             |   4 +-
 .../traffic_generator/traffic_generator.py    |   2 +-
 dts/framework/utils.py                        |   2 +-
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 19 files changed, 660 insertions(+), 1217 deletions(-)
 delete mode 120000 doc/api/dts/conf_yaml_schema.json
 delete mode 100644 doc/api/dts/framework.config.types.rst
 delete mode 100644 dts/framework/config/conf_yaml_schema.json
 delete mode 100644 dts/framework/config/types.py

diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
deleted file mode 120000
index 5978642d76..0000000000
--- a/doc/api/dts/conf_yaml_schema.json
+++ /dev/null
@@ -1 +0,0 @@
-../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
index 261997aefa..cc266276c1 100644
--- a/doc/api/dts/framework.config.rst
+++ b/doc/api/dts/framework.config.rst
@@ -6,9 +6,3 @@ config - Configuration Package
 .. automodule:: framework.config
    :members:
    :show-inheritance:
-
-.. toctree::
-   :hidden:
-   :maxdepth: 1
-
-   framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
deleted file mode 100644
index a50a0c874a..0000000000
--- a/doc/api/dts/framework.config.types.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-.. SPDX-License-Identifier: BSD-3-Clause
-
-config.types - Configuration Types
-==================================
-
-.. automodule:: framework.config.types
-   :members:
-   :show-inheritance:
diff --git a/dts/conf.yaml b/dts/conf.yaml
index 8a65a481d6..2496262854 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,11 +5,12 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      # dpdk_tree: Commented out because `tarball` is defined.
-      tarball: dpdk-tarball.tar.xz
-      # Either `dpdk_tree` or `tarball` can be defined, but not both.
-      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
-                    # is located on the SUT node, instead of the execution host.
+      dpdk_location:
+        # dpdk_tree: Commented out because `tarball` is defined.
+        tarball: dpdk-tarball.tar.xz
+        # Either `dpdk_tree` or `tarball` can be defined, but not both.
+        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
+                      # is located on the SUT node, instead of the execution host.
 
       # precompiled_build_dir: Commented out because `build_options` is defined.
       build_options:
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 7403ccbf14..d88fabc780 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -2,17 +2,18 @@
 # Copyright(c) 2010-2021 Intel Corporation
 # Copyright(c) 2022-2023 University of New Hampshire
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2024 Arm Limited
 
 """Testbed configuration and test suite specification.
 
 This package offers classes that hold real-time information about the testbed, hold test run
 configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
-the YAML test run configuration file
-and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
+model.
 
 The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
-this package. The allowed keys and types inside this dictionary are defined in
-the :doc:`types <framework.config.types>` module.
+this package. The allowed keys and types inside this dictionary map directly to the
+:class:`Configuration` model, its fields and sub-models.
 
 The test run configuration has two main sections:
 
@@ -24,39 +25,42 @@
 
 The real-time information about testbed is supposed to be gathered at runtime.
 
-The classes defined in this package make heavy use of :mod:`dataclasses`.
-All of them use slots and are frozen:
+The classes defined in this package make heavy use of :mod:`pydantic`.
+Nearly all of them are frozen:
 
-    * Slots enables some optimizations, by pre-allocating space for the defined
-      attributes in the underlying data structure,
     * Frozen makes the object immutable. This enables further optimizations,
       and makes it thread safe should we ever want to move in that direction.
 """
 
-import json
-import os.path
 import tarfile
-from dataclasses import dataclass, fields
-from enum import auto, unique
-from pathlib import Path
-from typing import Union
+from enum import Enum, auto, unique
+from functools import cached_property
+from pathlib import Path, PurePath
+from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple
 
-import warlock  # type: ignore[import-untyped]
 import yaml
+from pydantic import (
+    BaseModel,
+    ConfigDict,
+    Field,
+    ValidationError,
+    field_validator,
+    model_validator,
+)
 from typing_extensions import Self
 
-from framework.config.types import (
-    ConfigurationDict,
-    DPDKBuildConfigDict,
-    DPDKConfigurationDict,
-    NodeConfigDict,
-    PortConfigDict,
-    TestRunConfigDict,
-    TestSuiteConfigDict,
-    TrafficGeneratorConfigDict,
-)
 from framework.exception import ConfigurationError
-from framework.utils import StrEnum
+from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
+
+if TYPE_CHECKING:
+    from framework.test_suite import TestSuiteSpec
+
+
+class FrozenModel(BaseModel):
+    """A pre-configured :class:`~pydantic.BaseModel`."""
+
+    #: Fields are set as read-only and any extra fields are forbidden.
+    model_config = ConfigDict(frozen=True, extra="forbid")
 
 
 @unique
@@ -118,15 +122,14 @@ class Compiler(StrEnum):
 
 
 @unique
-class TrafficGeneratorType(StrEnum):
+class TrafficGeneratorType(str, Enum):
     """The supported traffic generators."""
 
     #:
-    SCAPY = auto()
+    SCAPY = "SCAPY"
 
 
-@dataclass(slots=True, frozen=True)
-class HugepageConfiguration:
+class HugepageConfiguration(FrozenModel):
     r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -138,12 +141,10 @@ class HugepageConfiguration:
     force_first_numa: bool
 
 
-@dataclass(slots=True, frozen=True)
-class PortConfig:
+class PortConfig(FrozenModel):
     r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
-        node: The :class:`~framework.testbed_model.node.Node` where this port exists.
         pci: The PCI address of the port.
         os_driver_for_dpdk: The operating system driver name for use with DPDK.
         os_driver: The operating system driver name when the operating system controls the port.
@@ -152,70 +153,57 @@ class PortConfig:
         peer_pci: The PCI address of the port connected to this port.
     """
 
-    node: str
-    pci: str
-    os_driver_for_dpdk: str
-    os_driver: str
-    peer_node: str
-    peer_pci: str
-
-    @classmethod
-    def from_dict(cls, node: str, d: PortConfigDict) -> Self:
-        """A convenience method that creates the object from fewer inputs.
-
-        Args:
-            node: The node where this port exists.
-            d: The configuration dictionary.
-
-        Returns:
-            The port configuration instance.
-        """
-        return cls(node=node, **d)
-
-
-@dataclass(slots=True, frozen=True)
-class TrafficGeneratorConfig:
-    """The configuration of traffic generators.
-
-    The class will be expanded when more configuration is needed.
+    pci: str = Field(
+        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
+    )
+    os_driver_for_dpdk: str = Field(
+        description="The driver that the kernel should bind this device to for DPDK to use it.",
+        examples=["vfio-pci", "mlx5_core"],
+    )
+    os_driver: str = Field(
+        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
+    )
+    peer_node: str = Field(description="The name of the peer node this port is connected to.")
+    peer_pci: str = Field(
+        description="The PCI address of the peer port this port is connected to.",
+        pattern=REGEX_FOR_PCI_ADDRESS,
+    )
+
+
+class TrafficGeneratorConfig(FrozenModel):
+    """A protocol required to define traffic generator types.
 
     Attributes:
-        traffic_generator_type: The type of the traffic generator.
+        type: The traffic generator type, the child class is required to define to be distinguished
+            among others.
     """
 
-    traffic_generator_type: TrafficGeneratorType
+    type: TrafficGeneratorType
 
-    @staticmethod
-    def from_dict(d: TrafficGeneratorConfigDict) -> "TrafficGeneratorConfig":
-        """A convenience method that produces traffic generator config of the proper type.
 
-        Args:
-            d: The configuration dictionary.
+class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+    """Scapy traffic generator specific configuration."""
 
-        Returns:
-            The traffic generator configuration instance.
+    type: Literal[TrafficGeneratorType.SCAPY]
 
-        Raises:
-            ConfigurationError: An unknown traffic generator type was encountered.
-        """
-        match TrafficGeneratorType(d["type"]):
-            case TrafficGeneratorType.SCAPY:
-                return ScapyTrafficGeneratorConfig(
-                    traffic_generator_type=TrafficGeneratorType.SCAPY
-                )
-            case _:
-                raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
 
+#: A union type discriminating traffic generators by the `type` field.
+TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-@dataclass(slots=True, frozen=True)
-class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
-    """Scapy traffic generator specific configuration."""
 
-    pass
+#: A field representing logical core ranges.
+LogicalCores = Annotated[
+    str,
+    Field(
+        description="Comma-separated list of logical cores to use. "
+        "An empty string means use all lcores.",
+        examples=["1,2,3,4,5,18-22", "10-15"],
+        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
+    ),
+]
 
 
-@dataclass(slots=True, frozen=True)
-class NodeConfiguration:
+class NodeConfiguration(FrozenModel):
     r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
 
     Attributes:
@@ -234,69 +222,24 @@ class NodeConfiguration:
         ports: The ports that can be used in testing.
     """
 
-    name: str
-    hostname: str
-    user: str
-    password: str | None
+    name: str = Field(description="A unique identifier for this node.")
+    hostname: str = Field(description="The hostname or IP address of the node.")
+    user: str = Field(description="The login user to use to connect to this node.")
+    password: str | None = Field(
+        default=None,
+        description="The login password to use to connect to this node. "
+        "SSH keys are STRONGLY preferred, use only as last resort.",
+    )
     arch: Architecture
     os: OS
-    lcores: str
-    use_first_core: bool
-    hugepages: HugepageConfiguration | None
-    ports: list[PortConfig]
-
-    @staticmethod
-    def from_dict(
-        d: NodeConfigDict,
-    ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
-        """A convenience method that processes the inputs before creating a specialized instance.
-
-        Args:
-            d: The configuration dictionary.
-
-        Returns:
-            Either an SUT or TG configuration instance.
-        """
-        hugepage_config = None
-        if "hugepages_2mb" in d:
-            hugepage_config_dict = d["hugepages_2mb"]
-            if "force_first_numa" not in hugepage_config_dict:
-                hugepage_config_dict["force_first_numa"] = False
-            hugepage_config = HugepageConfiguration(**hugepage_config_dict)
-
-        # The calls here contain duplicated code which is here because Mypy doesn't
-        # properly support dictionary unpacking with TypedDicts
-        if "traffic_generator" in d:
-            return TGNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
-            )
-        else:
-            return SutNodeConfiguration(
-                name=d["name"],
-                hostname=d["hostname"],
-                user=d["user"],
-                password=d.get("password"),
-                arch=Architecture(d["arch"]),
-                os=OS(d["os"]),
-                lcores=d.get("lcores", "1"),
-                use_first_core=d.get("use_first_core", False),
-                hugepages=hugepage_config,
-                ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
-                memory_channels=d.get("memory_channels", 1),
-            )
+    lcores: LogicalCores = "1"
+    use_first_core: bool = Field(
+        default=False, description="DPDK won't use the first physical core if set to False."
+    )
+    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    ports: list[PortConfig] = Field(min_length=1)
 
 
-@dataclass(slots=True, frozen=True)
 class SutNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
 
@@ -304,10 +247,11 @@ class SutNodeConfiguration(NodeConfiguration):
         memory_channels: The number of memory channels to use when running DPDK.
     """
 
-    memory_channels: int
+    memory_channels: int = Field(
+        default=1, description="Number of memory channels to use when running DPDK."
+    )
 
 
-@dataclass(slots=True, frozen=True)
 class TGNodeConfiguration(NodeConfiguration):
     """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
 
@@ -315,204 +259,280 @@ class TGNodeConfiguration(NodeConfiguration):
         traffic_generator: The configuration of the traffic generator present on the TG node.
     """
 
-    traffic_generator: TrafficGeneratorConfig
+    traffic_generator: TrafficGeneratorConfigTypes
+
+
+#: Union type for all the node configuration types.
+NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKBuildConfiguration:
-    """DPDK build configuration.
+def resolve_path(path: Path) -> Path:
+    """Resolve a path into a real path."""
+    return path.resolve()
 
-    The configuration used for building DPDK.
+
+class BaseDPDKLocation(FrozenModel):
+    """DPDK location.
+
+    The path to the DPDK sources, build dir and type of location.
 
     Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when
-            executing the build. Useful for adding wrapper commands, such as ``ccache``.
-        name: The name of the compiler.
+        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
+            located on the SUT node, instead of the execution host.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_wrapper: str
-    name: str
+    remote: bool = False
 
-    @classmethod
-    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
-        r"""A convenience method that processes the inputs before creating an instance.
 
-        `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
-        `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+class LocalDPDKLocation(BaseDPDKLocation):
+    """Local DPDK location parent class.
 
-        Args:
-            d: The configuration dictionary.
+    This class is meant to represent any location that is present only locally.
+    """
 
-        Returns:
-            The DPDK build configuration instance.
-        """
-        return cls(
-            arch=Architecture(d["arch"]),
-            os=OS(d["os"]),
-            cpu=CPUType(d["cpu"]),
-            compiler=Compiler(d["compiler"]),
-            compiler_wrapper=d.get("compiler_wrapper", ""),
-            name=f"{d['arch']}-{d['os']}-{d['cpu']}-{d['compiler']}",
-        )
+    remote: Literal[False] = False
 
 
-@dataclass(slots=True, frozen=True)
-class DPDKLocation:
-    """DPDK location.
+class LocalDPDKTreeLocation(LocalDPDKLocation):
+    """Local DPDK tree location.
 
-    The path to the DPDK sources, build dir and type of location.
+    This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
+    validation.
 
     Attributes:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
-        build_dir: If it's defined, DPDK has been pre-compiled and the build directory is located in
-            a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will be using
-            `build_options` from configuration to build the DPDK from source.
+        dpdk_tree: The path to the DPDK source tree directory.
     """
 
-    dpdk_tree: str | None
-    tarball: str | None
-    remote: bool
-    build_dir: str | None
+    dpdk_tree: Path
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes and validates the inputs before creating an instance.
+    #: Resolve the local DPDK tree path
+    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
-        Validate existence and format of `dpdk_tree` or `tarball` on local filesystem, if
-        `remote` is False.
+    @model_validator(mode="after")
+    def validate_dpdk_tree_path(self) -> Self:
+        """Validate the provided DPDK tree path."""
+        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
+        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
+        return self
 
-        Args:
-            d: The configuration dictionary.
 
-        Returns:
-            The DPDK location instance.
+class LocalDPDKTarballLocation(LocalDPDKLocation):
+    """Local DPDK tarball location.
 
-        Raises:
-            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem or they
-                aren't in the right format.
-        """
-        dpdk_tree = d.get("dpdk_tree")
-        tarball = d.get("tarball")
-        remote = d.get("remote", False)
-
-        if not remote:
-            if dpdk_tree:
-                if not Path(dpdk_tree).exists():
-                    raise ConfigurationError(
-                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                    )
-
-                if not Path(dpdk_tree).is_dir():
-                    raise ConfigurationError(f"The DPDK tree '{dpdk_tree}' must be a directory.")
-
-                dpdk_tree = os.path.realpath(dpdk_tree)
-
-            if tarball:
-                if not Path(tarball).exists():
-                    raise ConfigurationError(
-                        f"DPDK tarball '{tarball}' not found in local filesystem."
-                    )
-
-                if not tarfile.is_tarfile(tarball):
-                    raise ConfigurationError(
-                        f"The DPDK tarball '{tarball}' must be a valid tar archive."
-                    )
-
-        return cls(
-            dpdk_tree=dpdk_tree,
-            tarball=tarball,
-            remote=remote,
-            build_dir=d.get("precompiled_build_dir"),
-        )
+    This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: Path
+
+    #: Resolve the local tarball path
+    resolve_tarball_path = field_validator("tarball")(resolve_path)
+
+    @model_validator(mode="after")
+    def validate_tarball_path(self) -> Self:
+        """Validate the provided tarball."""
+        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
+        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
+        return self
+
+
+class RemoteDPDKLocation(BaseDPDKLocation):
+    """Remote DPDK location parent class.
+
+    This class is meant to represent any location that is present only remotely.
+    """
+
+    remote: Literal[True] = True
+
+
+class RemoteDPDKTreeLocation(RemoteDPDKLocation):
+    """Remote DPDK tree location.
+
+    This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK source tree directory.
+    """
+
+    dpdk_tree: PurePath
 
 
-@dataclass
-class DPDKConfiguration:
-    """The configuration of the DPDK build.
+class RemoteDPDKTarballLocation(RemoteDPDKLocation):
+    """Remote DPDK tarball location.
 
-    The configuration contain the location of the DPDK and configuration used for
-    building it.
+    This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
+    validation.
+
+    Attributes:
+        tarball: The path to the DPDK tarball.
+    """
+
+    tarball: PurePath
+
+
+#: Union type for different DPDK locations
+DPDKLocation = (
+    LocalDPDKTreeLocation
+    | LocalDPDKTarballLocation
+    | RemoteDPDKTreeLocation
+    | RemoteDPDKTarballLocation
+)
+
+
+class BaseDPDKBuildConfiguration(FrozenModel):
+    """The base configuration for different types of build.
+
+    The configuration contain the location of the DPDK and configuration used for building it.
 
     Attributes:
         dpdk_location: The location of the DPDK tree.
-        dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-            DTS will use pre-built DPDK from `build_dir` in a :class:`DPDKLocation`.
     """
 
     dpdk_location: DPDKLocation
-    dpdk_build_config: DPDKBuildConfiguration | None
 
-    @classmethod
-    def from_dict(cls, d: DPDKConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
 
-        Args:
-            d: The configuration dictionary.
+class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration):
+    """DPDK precompiled build configuration.
 
-        Returns:
-            The DPDK configuration.
-        """
-        return cls(
-            dpdk_location=DPDKLocation.from_dict(d),
-            dpdk_build_config=(
-                DPDKBuildConfiguration.from_dict(d["build_options"])
-                if d.get("build_options")
-                else None
-            ),
-        )
+    Attributes:
+        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
+            is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will
+            be using `dpdk_build_config` from configuration to build the DPDK from source.
+    """
+
+    precompiled_build_dir: str = Field(min_length=1)
+
+
+class DPDKBuildOptionsConfiguration(FrozenModel):
+    """DPDK build options configuration.
+
+    The build options used for building DPDK.
+
+    Attributes:
+        arch: The target architecture to build for.
+        os: The target os to build for.
+        cpu: The target CPU to build for.
+        compiler: The compiler executable to use.
+        compiler_wrapper: This string will be put in front of the compiler when executing the build.
+            Useful for adding wrapper commands, such as ``ccache``.
+    """
+
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_wrapper: str = ""
+
+    @cached_property
+    def name(self) -> str:
+        """The name of the compiler."""
+        return f"{self.arch}-{self.os}-{self.cpu}-{self.compiler}"
+
+
+class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration):
+    """DPDK uncompiled build configuration.
+
+    Attributes:
+        build_options: The build options to compile DPDK.
+    """
+
+    build_options: DPDKBuildOptionsConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class TestSuiteConfig:
+#: Union type for different build configurations
+DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
+
+
+class TestSuiteConfig(FrozenModel):
     """Test suite configuration.
 
-    Information about a single test suite to be executed.
+    Information about a single test suite to be executed. This can also be represented as a string
+    instead of a mapping, example:
+
+    .. code:: yaml
+
+        test_runs:
+        - test_suites:
+            # As string representation:
+            - hello_world # test all of `hello_world`, or
+            - hello_world hello_world_single_core # test only `hello_world_single_core`
+            # or as model fields:
+            - test_suite: hello_world
+              test_cases: [hello_world_single_core] # without this field all test cases are run
 
     Attributes:
-        test_suite: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases: The names of test cases from this test suite to execute.
+        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
+        test_cases_names: The names of test cases from this test suite to execute.
             If empty, all test cases will be executed.
     """
 
-    test_suite: str
-    test_cases: list[str]
-
+    test_suite_name: str = Field(
+        title="Test suite name",
+        description="The identifying module name of the test suite without the prefix.",
+        alias="test_suite",
+    )
+    test_cases_names: list[str] = Field(
+        default_factory=list,
+        title="Test cases by name",
+        description="The identifying name of the test cases of the test suite.",
+        alias="test_cases",
+    )
+
+    @cached_property
+    def test_suite_spec(self) -> "TestSuiteSpec":
+        """The specification of the requested test suite."""
+        from framework.test_suite import find_by_name
+
+        test_suite_spec = find_by_name(self.test_suite_name)
+        assert (
+            test_suite_spec is not None
+        ), f"{self.test_suite_name} is not a valid test suite module name."
+        return test_suite_spec
+
+    @model_validator(mode="before")
     @classmethod
-    def from_dict(
-        cls,
-        entry: str | TestSuiteConfigDict,
-    ) -> Self:
-        """Create an instance from two different types.
+    def convert_from_string(cls, data: Any) -> Any:
+        """Convert the string representation of the model into a valid mapping."""
+        if isinstance(data, str):
+            [test_suite, *test_cases] = data.split()
+            return dict(test_suite=test_suite, test_cases=test_cases)
+        return data
+
+    @model_validator(mode="after")
+    def validate_names(self) -> Self:
+        """Validate the supplied test suite and test cases names.
+
+        This validator relies on the cached property `test_suite_spec` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        # Use a list, not a lazy iterator: the membership checks below may run more than
+        # once, and an iterator would be exhausted by the first one.
+        available_test_cases = [
+            test_case.name for test_case in self.test_suite_spec.class_obj.get_test_cases()
+        ]
+        for requested_test_case in self.test_cases_names:
+            assert requested_test_case in available_test_cases, (
+                f"{requested_test_case} is not a valid test case "
+                f"of test suite {self.test_suite_name}."
+            )
 
-        Args:
-            entry: Either a suite name or a dictionary containing the config.
+        return self
 
-        Returns:
-            The test suite configuration instance.
-        """
-        if isinstance(entry, str):
-            return cls(test_suite=entry, test_cases=[])
-        elif isinstance(entry, dict):
-            return cls(test_suite=entry["suite"], test_cases=entry["cases"])
-        else:
-            raise TypeError(f"{type(entry)} is not valid for a test suite config.")
 
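Both accepted representations validate to the same model. A minimal sketch, assuming the patched `framework.config` package is importable from the DTS tree (so that the `hello_world` suite is discoverable):

    from framework.config import TestSuiteConfig

    by_string = TestSuiteConfig.model_validate("hello_world hello_world_single_core")
    by_mapping = TestSuiteConfig.model_validate(
        {"test_suite": "hello_world", "test_cases": ["hello_world_single_core"]}
    )
    assert by_string.test_suite_name == by_mapping.test_suite_name
    assert by_string.test_cases_names == by_mapping.test_cases_names
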
+class TestRunSUTNodeConfiguration(FrozenModel):
+    """The SUT node configuration of a test run.
 
-@dataclass(slots=True, frozen=True)
-class TestRunConfiguration:
+    Attributes:
+        node_name: The SUT node to use in this test run.
+        vdevs: The names of virtual devices to test.
+    """
+
+    node_name: str
+    vdevs: list[str] = Field(default_factory=list)
+
+
+class TestRunConfiguration(FrozenModel):
     """The configuration of a test run.
 
     The configuration contains testbed information, what tests to execute
@@ -524,144 +544,130 @@ class TestRunConfiguration:
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
         test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node to use in this test run.
-        traffic_generator_node: The TG node to use in this test run.
-        vdevs: The names of virtual devices to test.
+        system_under_test_node: The SUT node configuration to use in this test run.
+        traffic_generator_node: The TG node name to use in this test run.
         random_seed: The seed to use for pseudo-random generation.
     """
 
-    dpdk_config: DPDKConfiguration
-    perf: bool
-    func: bool
-    skip_smoke_tests: bool
-    test_suites: list[TestSuiteConfig]
-    system_under_test_node: SutNodeConfiguration
-    traffic_generator_node: TGNodeConfiguration
-    vdevs: list[str]
-    random_seed: int | None
+    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
+    perf: bool = Field(description="Enable performance testing.")
+    func: bool = Field(description="Enable functional testing.")
+    skip_smoke_tests: bool = False
+    test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    system_under_test_node: TestRunSUTNodeConfiguration
+    traffic_generator_node: str
+    random_seed: int | None = None
 
-    @classmethod
-    def from_dict(
-        cls,
-        d: TestRunConfigDict,
-        node_map: dict[str, SutNodeConfiguration | TGNodeConfiguration],
-    ) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
-
-        The DPDK build and the test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
-
-        Args:
-            d: The test run configuration dictionary.
-            node_map: A dictionary mapping node names to their config objects.
-
-        Returns:
-            The test run configuration instance.
-        """
-        test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
-        sut_name = d["system_under_test_node"]["node_name"]
-        skip_smoke_tests = d.get("skip_smoke_tests", False)
-        assert sut_name in node_map, f"Unknown SUT {sut_name} in test run {d}"
-        system_under_test_node = node_map[sut_name]
-        assert isinstance(
-            system_under_test_node, SutNodeConfiguration
-        ), f"Invalid SUT configuration {system_under_test_node}"
-
-        tg_name = d["traffic_generator_node"]
-        assert tg_name in node_map, f"Unknown TG {tg_name} in test run {d}"
-        traffic_generator_node = node_map[tg_name]
-        assert isinstance(
-            traffic_generator_node, TGNodeConfiguration
-        ), f"Invalid TG configuration {traffic_generator_node}"
-
-        vdevs = (
-            d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
-        )
-        random_seed = d.get("random_seed", None)
-        return cls(
-            dpdk_config=DPDKConfiguration.from_dict(d["dpdk_build"]),
-            perf=d["perf"],
-            func=d["func"],
-            skip_smoke_tests=skip_smoke_tests,
-            test_suites=test_suites,
-            system_under_test_node=system_under_test_node,
-            traffic_generator_node=traffic_generator_node,
-            vdevs=vdevs,
-            random_seed=random_seed,
-        )
-
-    def copy_and_modify(self, **kwargs) -> Self:
-        """Create a shallow copy with any of the fields modified.
-
-        The only new data are those passed to this method.
-        The rest are copied from the object's fields calling the method.
 
-        Args:
-            **kwargs: The names and types of keyword arguments are defined
-                by the fields of the :class:`TestRunConfiguration` class.
-
-        Returns:
-            The copied and modified test run configuration.
-        """
-        new_config = {}
-        for field in fields(self):
-            if field.name in kwargs:
-                new_config[field.name] = kwargs[field.name]
-            else:
-                new_config[field.name] = getattr(self, field.name)
+class TestRunWithNodesConfiguration(NamedTuple):
+    """Tuple containing the configuration of the test run and its associated nodes."""
 
-        return type(self)(**new_config)
+    #:
+    test_run_config: TestRunConfiguration
+    #:
+    sut_node_config: SutNodeConfiguration
+    #:
+    tg_node_config: TGNodeConfiguration
 
 
-@dataclass(slots=True, frozen=True)
-class Configuration:
+class Configuration(FrozenModel):
     """DTS testbed and test configuration.
 
-    The node configuration is not stored in this object. Rather, all used node configurations
-    are stored inside the test run configuration where the nodes are actually used.
-
     Attributes:
         test_runs: Test run configurations.
+        nodes: Node configurations.
     """
 
-    test_runs: list[TestRunConfiguration]
+    test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
-    @classmethod
-    def from_dict(cls, d: ConfigurationDict) -> Self:
-        """A convenience method that processes the inputs before creating an instance.
+    @cached_property
+    def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
+        """List of test runs with the associated nodes."""
+        test_runs_with_nodes = []
 
-        DPDK build and test suite config are transformed into their respective objects.
-        SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
-        are just stored.
+        for test_run_no, test_run in enumerate(self.test_runs):
+            sut_node_name = test_run.system_under_test_node.node_name
+            sut_node = next(filter(lambda n: n.name == sut_node_name, self.nodes), None)
 
-        Args:
-            d: The configuration dictionary.
+            assert sut_node is not None, (
+                f"test_runs.{test_run_no}.system_under_test_node.node_name "
+                f"({test_run.system_under_test_node.node_name}) is not a valid node name"
+            )
+            assert isinstance(sut_node, SutNodeConfiguration), (
+                f"test_runs.{test_run_no}.system_under_test_node.node_name is a valid node "
+                "name, but it is not a valid SUT node"
+            )
 
-        Returns:
-            The whole configuration instance.
-        """
-        nodes: list[SutNodeConfiguration | TGNodeConfiguration] = list(
-            map(NodeConfiguration.from_dict, d["nodes"])
-        )
-        assert len(nodes) > 0, "There must be a node to test"
+            tg_node_name = test_run.traffic_generator_node
+            tg_node = next(filter(lambda n: n.name == tg_node_name, self.nodes), None)
 
-        node_map = {node.name: node for node in nodes}
-        assert len(nodes) == len(node_map), "Duplicate node names are not allowed"
+            assert tg_node is not None, (
+                f"test_runs.{test_run_no}.traffic_generator_node "
+                f"({test_run.traffic_generator_node}) is not a valid node name"
+            )
+            assert isinstance(tg_node, TGNodeConfiguration), (
+                f"test_runs.{test_run_no}.traffic_generator_node is a valid node name, "
+                "but it is not a valid TG node"
+            )
 
-        test_runs: list[TestRunConfiguration] = list(
-            map(TestRunConfiguration.from_dict, d["test_runs"], [node_map for _ in d])
-        )
+            test_runs_with_nodes.append(TestRunWithNodesConfiguration(test_run, sut_node, tg_node))
 
-        return cls(test_runs=test_runs)
+        return test_runs_with_nodes
+
+    @field_validator("nodes")
+    @classmethod
+    def validate_node_names(cls, nodes: list[NodeConfiguration]) -> list[NodeConfiguration]:
+        """Validate that the node names are unique."""
+        nodes_by_name: dict[str, int] = {}
+        for node_no, node in enumerate(nodes):
+            assert node.name not in nodes_by_name, (
+                f"node {node_no} cannot have the same name as node {nodes_by_name[node.name]} "
+                f"({node.name})"
+            )
+            nodes_by_name[node.name] = node_no
+
+        return nodes
+
+    @model_validator(mode="after")
+    def validate_ports(self) -> Self:
+        """Validate that the ports are all linked to valid ones."""
+        port_links: dict[tuple[str, str], Literal[False] | tuple[int, int]] = {
+            (node.name, port.pci): False for node in self.nodes for port in node.ports
+        }
+
+        for node_no, node in enumerate(self.nodes):
+            for port_no, port in enumerate(node.ports):
+                peer_port_identifier = (port.peer_node, port.peer_pci)
+                peer_port = port_links.get(peer_port_identifier, None)
+                assert peer_port is not None, (
+                    "invalid peer port specified for " f"nodes.{node_no}.ports.{port_no}"
+                )
+                assert peer_port is False, (
+                    f"the peer port specified for nodes.{node_no}.ports.{port_no} "
+                    f"is already linked to nodes.{peer_port[0]}.ports.{peer_port[1]}"
+                )
+                port_links[peer_port_identifier] = (node_no, port_no)
+
+        return self
+
+    @model_validator(mode="after")
+    def validate_test_runs_with_nodes(self) -> Self:
+        """Validate the test runs to nodes associations.
+
+        This validator relies on the cached property `test_runs_with_nodes` to run for the first
+        time in this call, therefore triggering the assertions if needed.
+        """
+        if self.test_runs_with_nodes:
+            pass
+        return self
 
 
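The cached-property trick used by `validate_test_runs_with_nodes` above can be shown standalone. A minimal sketch, assuming only Pydantic v2 and none of the framework's models:

    from functools import cached_property

    from pydantic import BaseModel, ValidationError, model_validator

    class Example(BaseModel):
        x: int

        @cached_property
        def derived(self) -> int:
            assert self.x > 0, "x must be positive"
            return self.x * 2

        @model_validator(mode="after")
        def check(self) -> "Example":
            # Evaluating the cached property runs its assertion at load time;
            # Pydantic converts the AssertionError into a ValidationError.
            _ = self.derived
            return self

    try:
        Example(x=-1)
    except ValidationError as e:
        print(e)
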
 def load_config(config_file_path: Path) -> Configuration:
     """Load DTS test run configuration from a file.
 
-    Load the YAML test run configuration file
-    and :download:`the configuration file schema <conf_yaml_schema.json>`,
-    validate the test run configuration file, and create a test run configuration object.
+    Load the YAML test run configuration file, validate it, and create a test run configuration
+    object.
 
     The YAML test run configuration file is specified in the :option:`--config-file` command line
     argument or the :envvar:`DTS_CFG_FILE` environment variable.
@@ -671,14 +677,14 @@ def load_config(config_file_path: Path) -> Configuration:
 
     Returns:
         The parsed test run configuration.
+
+    Raises:
+        ConfigurationError: If the supplied configuration file is invalid.
     """
     with open(config_file_path, "r") as f:
         config_data = yaml.safe_load(f)
 
-    schema_path = os.path.join(Path(__file__).parent.resolve(), "conf_yaml_schema.json")
-
-    with open(schema_path, "r") as f:
-        schema = json.load(f)
-    config = warlock.model_factory(schema, name="_Config")(config_data)
-    config_obj: Configuration = Configuration.from_dict(dict(config))  # type: ignore[arg-type]
-    return config_obj
+    try:
+        return Configuration.model_validate(config_data)
+    except ValidationError as e:
+        raise ConfigurationError("failed to load the supplied configuration") from e
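A minimal usage sketch of the new loader (the configuration file name is hypothetical). Pydantic's `ValidationError` is chained as the cause, so the per-field error paths remain available:

    from pathlib import Path

    from framework.config import load_config
    from framework.exception import ConfigurationError

    try:
        config = load_config(Path("conf.yaml"))
    except ConfigurationError as e:
        print(e.__cause__)  # the underlying pydantic.ValidationError
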
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
deleted file mode 100644
index cc3e78cef5..0000000000
--- a/dts/framework/config/conf_yaml_schema.json
+++ /dev/null
@@ -1,459 +0,0 @@
-{
-  "$schema": "https://json-schema.org/draft-07/schema",
-  "title": "DTS Config Schema",
-  "definitions": {
-    "node_name": {
-      "type": "string",
-      "description": "A unique identifier for a node"
-    },
-    "NIC": {
-      "type": "string",
-      "enum": [
-        "ALL",
-        "ConnectX3_MT4103",
-        "ConnectX4_LX_MT4117",
-        "ConnectX4_MT4115",
-        "ConnectX5_MT4119",
-        "ConnectX5_MT4121",
-        "I40E_10G-10G_BASE_T_BC",
-        "I40E_10G-10G_BASE_T_X722",
-        "I40E_10G-SFP_X722",
-        "I40E_10G-SFP_XL710",
-        "I40E_10G-X722_A0",
-        "I40E_1G-1G_BASE_T_X722",
-        "I40E_25G-25G_SFP28",
-        "I40E_40G-QSFP_A",
-        "I40E_40G-QSFP_B",
-        "IAVF-ADAPTIVE_VF",
-        "IAVF-VF",
-        "IAVF_10G-X722_VF",
-        "ICE_100G-E810C_QSFP",
-        "ICE_25G-E810C_SFP",
-        "ICE_25G-E810_XXV_SFP",
-        "IGB-I350_VF",
-        "IGB_1G-82540EM",
-        "IGB_1G-82545EM_COPPER",
-        "IGB_1G-82571EB_COPPER",
-        "IGB_1G-82574L",
-        "IGB_1G-82576",
-        "IGB_1G-82576_QUAD_COPPER",
-        "IGB_1G-82576_QUAD_COPPER_ET2",
-        "IGB_1G-82580_COPPER",
-        "IGB_1G-I210_COPPER",
-        "IGB_1G-I350_COPPER",
-        "IGB_1G-I354_SGMII",
-        "IGB_1G-PCH_LPTLP_I218_LM",
-        "IGB_1G-PCH_LPTLP_I218_V",
-        "IGB_1G-PCH_LPT_I217_LM",
-        "IGB_1G-PCH_LPT_I217_V",
-        "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
-        "IGC-I225_LM",
-        "IGC-I226_LM",
-        "IXGBE_10G-82599_SFP",
-        "IXGBE_10G-82599_SFP_SF_QP",
-        "IXGBE_10G-82599_T3_LOM",
-        "IXGBE_10G-82599_VF",
-        "IXGBE_10G-X540T",
-        "IXGBE_10G-X540_VF",
-        "IXGBE_10G-X550EM_A_SFP",
-        "IXGBE_10G-X550EM_X_10G_T",
-        "IXGBE_10G-X550EM_X_SFP",
-        "IXGBE_10G-X550EM_X_VF",
-        "IXGBE_10G-X550T",
-        "IXGBE_10G-X550_VF",
-        "brcm_57414",
-        "brcm_P2100G",
-        "cavium_0011",
-        "cavium_a034",
-        "cavium_a063",
-        "cavium_a064",
-        "fastlinq_ql41000",
-        "fastlinq_ql41000_vf",
-        "fastlinq_ql45000",
-        "fastlinq_ql45000_vf",
-        "hi1822",
-        "virtio"
-      ]
-    },
-
-    "ARCH": {
-      "type": "string",
-      "enum": [
-        "x86_64",
-        "arm64",
-        "ppc64le"
-      ]
-    },
-    "OS": {
-      "type": "string",
-      "enum": [
-        "linux"
-      ]
-    },
-    "cpu": {
-      "type": "string",
-      "description": "Native should be the default on x86",
-      "enum": [
-        "native",
-        "armv8a",
-        "dpaa2",
-        "thunderx",
-        "xgene1"
-      ]
-    },
-    "compiler": {
-      "type": "string",
-      "enum": [
-        "gcc",
-        "clang",
-        "icc",
-        "mscv"
-      ]
-    },
-    "build_options": {
-      "type": "object",
-      "properties": {
-        "arch": {
-          "type": "string",
-          "enum": [
-            "ALL",
-            "x86_64",
-            "arm64",
-            "ppc64le",
-            "other"
-          ]
-        },
-        "os": {
-          "$ref": "#/definitions/OS"
-        },
-        "cpu": {
-          "$ref": "#/definitions/cpu"
-        },
-        "compiler": {
-          "$ref": "#/definitions/compiler"
-        },
-        "compiler_wrapper": {
-          "type": "string",
-          "description": "This will be added before compiler to the CC variable when building DPDK. Optional."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "arch",
-        "os",
-        "cpu",
-        "compiler"
-      ]
-    },
-    "dpdk_build": {
-      "type": "object",
-      "description": "DPDK source and build configuration.",
-      "properties": {
-        "dpdk_tree": {
-          "type": "string",
-          "description": "The path to the DPDK source tree directory to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "tarball": {
-          "type": "string",
-          "description": "The path to the DPDK source tarball to test. Only one of `dpdk_tree` or `tarball` must be provided."
-        },
-        "remote": {
-          "type": "boolean",
-          "description": "Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball` is located on the SUT node, instead of the execution host."
-        },
-        "precompiled_build_dir": {
-          "type": "string",
-          "description": "If it's defined, DPDK has been pre-built and the build directory is located in a subdirectory of DPDK tree root directory. Otherwise, will be using a `build_options` to build the DPDK from source. Either this or `build_options` must be defined, but not both."
-        },
-        "build_options": {
-          "$ref": "#/definitions/build_options",
-          "description": "Either this or `precompiled_build_dir` must be defined, but not both. DPDK build configuration supported by DTS."
-        }
-      },
-      "allOf": [
-        {
-          "oneOf": [
-            {
-            "required": [
-              "dpdk_tree"
-              ]
-            },
-            {
-              "required": [
-                "tarball"
-              ]
-            }
-          ]
-        },
-        {
-          "oneOf": [
-            {
-              "required": [
-                "precompiled_build_dir"
-              ]
-            },
-            {
-              "required": [
-                "build_options"
-              ]
-            }
-          ]
-        }
-      ],
-      "additionalProperties": false
-    },
-    "hugepages_2mb": {
-      "type": "object",
-      "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
-      "properties": {
-        "number_of": {
-          "type": "integer",
-          "description": "The number of hugepages to configure. Hugepage size will be the system default."
-        },
-        "force_first_numa": {
-          "type": "boolean",
-          "description": "Set to True to force configuring hugepages on the first NUMA node. Defaults to False."
-        }
-      },
-      "additionalProperties": false,
-      "required": [
-        "number_of"
-      ]
-    },
-    "mac_address": {
-      "type": "string",
-      "description": "A MAC address",
-      "pattern": "^([0-9A-Fa-f]{2}[:-]){5}([0-9A-Fa-f]{2})$"
-    },
-    "pci_address": {
-      "type": "string",
-      "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
-    },
-    "port_peer_address": {
-      "description": "Peer is a TRex port, and IXIA port or a PCI address",
-      "oneOf": [
-        {
-          "description": "PCI peer port",
-          "$ref": "#/definitions/pci_address"
-        }
-      ]
-    },
-    "test_suite": {
-      "type": "string",
-      "enum": [
-        "hello_world",
-        "os_udp",
-        "pmd_buffer_scatter",
-        "vlan"
-      ]
-    },
-    "test_target": {
-      "type": "object",
-      "properties": {
-        "suite": {
-          "$ref": "#/definitions/test_suite"
-        },
-        "cases": {
-          "type": "array",
-          "description": "If specified, only this subset of test suite's test cases will be run.",
-          "items": {
-            "type": "string"
-          },
-          "minimum": 1
-        }
-      },
-      "required": [
-        "suite"
-      ],
-      "additionalProperties": false
-    }
-  },
-  "type": "object",
-  "properties": {
-    "nodes": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "name": {
-            "type": "string",
-            "description": "A unique identifier for this node"
-          },
-          "hostname": {
-            "type": "string",
-            "description": "A hostname from which the node running DTS can access this node. This can also be an IP address."
-          },
-          "user": {
-            "type": "string",
-            "description": "The user to access this node with."
-          },
-          "password": {
-            "type": "string",
-            "description": "The password to use on this node. Use only as a last resort. SSH keys are STRONGLY preferred."
-          },
-          "arch": {
-            "$ref": "#/definitions/ARCH"
-          },
-          "os": {
-            "$ref": "#/definitions/OS"
-          },
-          "lcores": {
-            "type": "string",
-            "pattern": "^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
-            "description": "Optional comma-separated list of logical cores to use, e.g.: 1,2,3,4,5,18-22. Defaults to 1. An empty string means use all lcores."
-          },
-          "use_first_core": {
-            "type": "boolean",
-            "description": "Indicate whether DPDK should use the first physical core. It won't be used by default."
-          },
-          "memory_channels": {
-            "type": "integer",
-            "description": "How many memory channels to use. Optional, defaults to 1."
-          },
-          "hugepages_2mb": {
-            "$ref": "#/definitions/hugepages_2mb"
-          },
-          "ports": {
-            "type": "array",
-            "items": {
-              "type": "object",
-              "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
-              "properties": {
-                "pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The local PCI address of the port"
-                },
-                "os_driver_for_dpdk": {
-                  "type": "string",
-                  "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
-                },
-                "os_driver": {
-                  "type": "string",
-                  "description": "The driver normally used by this port (ex: i40e)"
-                },
-                "peer_node": {
-                  "type": "string",
-                  "description": "The name of the node the peer port is on"
-                },
-                "peer_pci": {
-                  "$ref": "#/definitions/pci_address",
-                  "description": "The PCI address of the peer port"
-                }
-              },
-              "additionalProperties": false,
-              "required": [
-                "pci",
-                "os_driver_for_dpdk",
-                "os_driver",
-                "peer_node",
-                "peer_pci"
-              ]
-            },
-            "minimum": 1
-          },
-          "traffic_generator": {
-            "oneOf": [
-              {
-                "type": "object",
-                "description": "Scapy traffic generator. Used for functional testing.",
-                "properties": {
-                  "type": {
-                    "type": "string",
-                    "enum": [
-                      "SCAPY"
-                    ]
-                  }
-                }
-              }
-            ]
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "name",
-          "hostname",
-          "user",
-          "arch",
-          "os"
-        ]
-      },
-      "minimum": 1
-    },
-    "test_runs": {
-      "type": "array",
-      "items": {
-        "type": "object",
-        "properties": {
-          "dpdk_build": {
-            "$ref": "#/definitions/dpdk_build"
-          },
-          "perf": {
-            "type": "boolean",
-            "description": "Enable performance testing."
-          },
-          "func": {
-            "type": "boolean",
-            "description": "Enable functional testing."
-          },
-          "test_suites": {
-            "type": "array",
-            "items": {
-              "oneOf": [
-                {
-                  "$ref": "#/definitions/test_suite"
-                },
-                {
-                  "$ref": "#/definitions/test_target"
-                }
-              ]
-            }
-          },
-          "skip_smoke_tests": {
-            "description": "Optional field that allows you to skip smoke testing",
-            "type": "boolean"
-          },
-          "system_under_test_node": {
-            "type":"object",
-            "properties": {
-              "node_name": {
-                "$ref": "#/definitions/node_name"
-              },
-              "vdevs": {
-                "description": "Optional list of names of vdevs to be used in the test run",
-                "type": "array",
-                "items": {
-                  "type": "string"
-                }
-              }
-            },
-            "required": [
-              "node_name"
-            ]
-          },
-          "traffic_generator_node": {
-            "$ref": "#/definitions/node_name"
-          },
-          "random_seed": {
-            "type": "integer",
-            "description": "Optional field. Allows you to set a seed for pseudo-random generation."
-          }
-        },
-        "additionalProperties": false,
-        "required": [
-          "dpdk_build",
-          "perf",
-          "func",
-          "test_suites",
-          "system_under_test_node",
-          "traffic_generator_node"
-        ]
-      },
-      "minimum": 1
-    }
-  },
-  "required": [
-    "test_runs",
-    "nodes"
-  ],
-  "additionalProperties": false
-}
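With the hand-written schema removed, an equivalent artifact can be regenerated from the models when needed. A rough sketch of the underlying Pydantic v2 call, not necessarily the exact script accompanying this series:

    import json

    from framework.config import Configuration

    print(json.dumps(Configuration.model_json_schema(), indent=4))
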
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
deleted file mode 100644
index 02e738a61e..0000000000
--- a/dts/framework/config/types.py
+++ /dev/null
@@ -1,149 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-"""Configuration dictionary contents specification.
-
-These type definitions serve as documentation of the configuration dictionary contents.
-
-The definitions use the built-in :class:`~typing.TypedDict` construct.
-"""
-
-from typing import TypedDict
-
-
-class PortConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    pci: str
-    #:
-    os_driver_for_dpdk: str
-    #:
-    os_driver: str
-    #:
-    peer_node: str
-    #:
-    peer_pci: str
-
-
-class TrafficGeneratorConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    type: str
-
-
-class HugepageConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    number_of: int
-    #:
-    force_first_numa: bool
-
-
-class NodeConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    hugepages_2mb: HugepageConfigurationDict
-    #:
-    name: str
-    #:
-    hostname: str
-    #:
-    user: str
-    #:
-    password: str
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    lcores: str
-    #:
-    use_first_core: bool
-    #:
-    ports: list[PortConfigDict]
-    #:
-    memory_channels: int
-    #:
-    traffic_generator: TrafficGeneratorConfigDict
-
-
-class DPDKBuildConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    arch: str
-    #:
-    os: str
-    #:
-    cpu: str
-    #:
-    compiler: str
-    #:
-    compiler_wrapper: str
-
-
-class DPDKConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_tree: str | None
-    #:
-    tarball: str | None
-    #:
-    remote: bool
-    #:
-    precompiled_build_dir: str | None
-    #:
-    build_options: DPDKBuildConfigDict
-
-
-class TestSuiteConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    suite: str
-    #:
-    cases: list[str]
-
-
-class TestRunSUTConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    node_name: str
-    #:
-    vdevs: list[str]
-
-
-class TestRunConfigDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    dpdk_build: DPDKConfigurationDict
-    #:
-    perf: bool
-    #:
-    func: bool
-    #:
-    skip_smoke_tests: bool
-    #:
-    test_suites: TestSuiteConfigDict
-    #:
-    system_under_test_node: TestRunSUTConfigDict
-    #:
-    traffic_generator_node: str
-    #:
-    random_seed: int
-
-
-class ConfigurationDict(TypedDict):
-    """Allowed keys and values."""
-
-    #:
-    nodes: list[NodeConfigDict]
-    #:
-    test_runs: list[TestRunConfigDict]
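These TypedDicts are superseded by models validated at load time. For reference, a frozen Pydantic v2 base resembles the sketch below; the actual `FrozenModel` is defined earlier in this patch and may set further options (e.g. forbidding extra fields):

    from pydantic import BaseModel, ConfigDict

    class FrozenModel(BaseModel):
        model_config = ConfigDict(frozen=True)
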
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 195622c653..c3d9a27a8c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -30,7 +30,15 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
+from .config import (
+    Configuration,
+    DPDKPrecompiledBuildConfiguration,
+    SutNodeConfiguration,
+    TestRunConfiguration,
+    TestSuiteConfig,
+    TGNodeConfiguration,
+    load_config,
+)
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -133,11 +141,10 @@ def run(self) -> None:
             self._result.update_setup(Result.PASS)
 
             # for all test run sections
-            for test_run_config in self._configuration.test_runs:
+            for test_run_with_nodes_config in self._configuration.test_runs_with_nodes:
+                test_run_config, sut_node_config, tg_node_config = test_run_with_nodes_config
                 self._logger.set_stage(DtsStage.test_run_setup)
-                self._logger.info(
-                    f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
-                )
+                self._logger.info(f"Running test run with SUT '{sut_node_config.name}'.")
                 self._init_random_seed(test_run_config)
                 test_run_result = self._result.add_test_run(test_run_config)
                 # we don't want to modify the original config, so create a copy
@@ -145,7 +152,7 @@ def run(self) -> None:
                     SETTINGS.test_suites if SETTINGS.test_suites else test_run_config.test_suites
                 )
                 if not test_run_config.skip_smoke_tests:
-                    test_run_test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
+                    test_run_test_suites[:0] = [TestSuiteConfig(test_suite="smoke_tests")]
                 try:
                     test_suites_with_cases = self._get_test_suites_with_cases(
                         test_run_test_suites, test_run_config.func, test_run_config.perf
@@ -161,6 +168,8 @@ def run(self) -> None:
                     self._connect_nodes_and_run_test_run(
                         sut_nodes,
                         tg_nodes,
+                        sut_node_config,
+                        tg_node_config,
                         test_run_config,
                         test_run_result,
                         test_suites_with_cases,
@@ -223,10 +232,10 @@ def _get_test_suites_with_cases(
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite)
+            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
-                test_suite_config.test_cases
+                test_suite_config.test_cases_names
             )
             if func:
                 test_cases.extend(func_test_cases)
@@ -305,6 +314,8 @@ def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
         tg_nodes: dict[str, TGNode],
+        sut_node_config: SutNodeConfiguration,
+        tg_node_config: TGNodeConfiguration,
         test_run_config: TestRunConfiguration,
         test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
@@ -319,24 +330,26 @@ def _connect_nodes_and_run_test_run(
         Args:
             sut_nodes: A dictionary storing connected/to be connected SUT nodes.
             tg_nodes: A dictionary storing connected/to be connected TG nodes.
+            sut_node_config: The test run's SUT node configuration.
+            tg_node_config: The test run's TG node configuration.
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        sut_node = sut_nodes.get(test_run_config.system_under_test_node.name)
-        tg_node = tg_nodes.get(test_run_config.traffic_generator_node.name)
+        sut_node = sut_nodes.get(sut_node_config.name)
+        tg_node = tg_nodes.get(tg_node_config.name)
 
         try:
             if not sut_node:
-                sut_node = SutNode(test_run_config.system_under_test_node)
+                sut_node = SutNode(sut_node_config)
                 sut_nodes[sut_node.name] = sut_node
             if not tg_node:
-                tg_node = TGNode(test_run_config.traffic_generator_node)
+                tg_node = TGNode(tg_node_config)
                 tg_nodes[tg_node.name] = tg_node
         except Exception as e:
-            failed_node = test_run_config.system_under_test_node.name
+            failed_node = test_run_config.system_under_test_node.node_name
             if sut_node:
-                failed_node = test_run_config.traffic_generator_node.name
+                failed_node = test_run_config.traffic_generator_node
             self._logger.exception(f"The Creation of node {failed_node} failed.")
             test_run_result.update_setup(Result.FAIL, e)
 
@@ -369,14 +382,22 @@ def _run_test_run(
             ConfigurationError: If the DPDK sources or build is not set up from config or settings.
         """
         self._logger.info(
-            f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
+            f"Running test run with SUT '{test_run_config.system_under_test_node.node_name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_config.dpdk_location
-            sut_node.set_up_test_run(test_run_config, dpdk_location)
+            dpdk_build_config = test_run_config.dpdk_config
+            if new_location := SETTINGS.dpdk_location:
+                dpdk_build_config = dpdk_build_config.model_copy(
+                    update={"dpdk_location": new_location}
+                )
+            if precompiled_build_dir := SETTINGS.precompiled_build_dir:
+                dpdk_build_config = DPDKPrecompiledBuildConfiguration(
+                    dpdk_location=dpdk_build_config.dpdk_location,
+                    precompiled_build_dir=precompiled_build_dir,
+                )
+            sut_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config, dpdk_location)
+            tg_node.set_up_test_run(test_run_config, dpdk_build_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index a32137dbb8..5a8e6e5aee 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -60,9 +60,8 @@
 .. option:: --precompiled-build-dir
 .. envvar:: DTS_PRECOMPILED_BUILD_DIR
 
-    Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
-    located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with
-    --dpdk-tree or --tarball.
+    Define the subdirectory under the DPDK tree root directory or tarball where the pre-compiled
+    binaries are located.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -95,13 +94,21 @@
 import argparse
 import os
 import sys
-import tarfile
 from argparse import Action, ArgumentDefaultsHelpFormatter, _get_action_name
 from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Callable
 
-from .config import DPDKLocation, TestSuiteConfig
+from pydantic import ValidationError
+
+from .config import (
+    DPDKLocation,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
+    TestSuiteConfig,
+)
 
 
 @dataclass(slots=True)
@@ -122,6 +129,8 @@ class Settings:
     #:
     dpdk_location: DPDKLocation | None = None
     #:
+    precompiled_build_dir: str | None = None
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -383,13 +392,11 @@ def _get_parser() -> _DTSArgumentParser:
 
     action = dpdk_build.add_argument(
         "--precompiled-build-dir",
-        help="Define the subdirectory under the DPDK tree root directory where the pre-compiled "
-        "binaries are located. If set, DTS will build DPDK under the `build` directory instead. "
-        "Can only be used with --dpdk-tree or --tarball.",
+        help="Define the subdirectory under the DPDK tree root directory or tarball where the "
+        "pre-compiled binaries are located.",
         metavar="DIR_NAME",
     )
     _add_env_var_to_action(action)
-    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
 
     action = parser.add_argument(
         "--compile-timeout",
@@ -442,61 +449,61 @@ def _get_parser() -> _DTSArgumentParser:
 
 
 def _process_dpdk_location(
+    parser: _DTSArgumentParser,
     dpdk_tree: str | None,
     tarball: str | None,
     remote: bool,
-    build_dir: str | None,
-):
+) -> DPDKLocation | None:
     """Process and validate DPDK build arguments.
 
     Ensures that either `dpdk_tree` or `tarball` is provided. Validate existence and format of
     `dpdk_tree` or `tarball` on local filesystem, if `remote` is False. Constructs and returns
-    the :class:`DPDKLocation` with the provided parameters if validation is successful.
+    the matching :class:`DPDKLocation` variant built from the parameters if validation succeeds.
 
     Args:
-        dpdk_tree: The path to the DPDK source tree directory. Only one of `dpdk_tree` or `tarball`
-            must be provided.
-        tarball: The path to the DPDK tarball. Only one of `dpdk_tree` or `tarball` must be
-            provided.
+        dpdk_tree: The path to the DPDK source tree directory.
+        tarball: The path to the DPDK tarball.
         remote: If :data:`True`, `dpdk_tree` or `tarball` is located on the SUT node, instead of the
             execution host.
-        build_dir: If it's defined, DPDK has been pre-built and the build directory is located in a
-            subdirectory of `dpdk_tree` or `tarball` root directory.
 
     Returns:
         A DPDK location if construction is successful, otherwise None.
-
-    Raises:
-        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` not found in local filesystem or
-            they aren't in the right format.
     """
-    if not (dpdk_tree or tarball):
-        return None
-
-    if not remote:
-        if dpdk_tree:
-            if not Path(dpdk_tree).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
-                )
-
-            if not Path(dpdk_tree).is_dir():
-                raise argparse.ArgumentTypeError(f"DPDK tree '{dpdk_tree}' must be a directory.")
-
-            dpdk_tree = os.path.realpath(dpdk_tree)
-
-        if tarball:
-            if not Path(tarball).exists():
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' not found in local filesystem."
-                )
-
-            if not tarfile.is_tarfile(tarball):
-                raise argparse.ArgumentTypeError(
-                    f"DPDK tarball '{tarball}' must be a valid tar archive."
-                )
-
-    return DPDKLocation(dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir)
+    if dpdk_tree:
+        action = parser.find_action("dpdk_tree", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+            else:
+                return LocalDPDKTreeLocation.model_validate({"dpdk_tree": dpdk_tree})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tree supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    if tarball:
+        action = parser.find_action("tarball", _is_from_env)
+
+        try:
+            if remote:
+                return RemoteDPDKTarballLocation.model_validate({"tarball": tarball})
+            else:
+                return LocalDPDKTarballLocation.model_validate({"tarball": tarball})
+        except ValidationError as e:
+            print(
+                "An error has occurred while validating the DPDK tarball supplied in the "
+                f"{'environment variable' if action else 'arguments'}:",
+                file=sys.stderr,
+            )
+            print(e, file=sys.stderr)
+            sys.exit(1)
+
+    return None
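Because the filesystem checks now live in the location models (as the error handling above suggests), an invalid path surfaces as a `ValidationError`. A hedged sketch, assuming the local tree model validates path existence:

    from pydantic import ValidationError

    from framework.config import LocalDPDKTreeLocation

    try:
        LocalDPDKTreeLocation.model_validate({"dpdk_tree": "/no/such/dir"})
    except ValidationError as e:
        print(e)  # names the failing field and the reason
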
 
 
 def _process_test_suites(
@@ -512,11 +519,24 @@ def _process_test_suites(
     Returns:
         A list of test suite configurations to execute.
     """
-    if parser.find_action("test_suites", _is_from_env):
+    action = parser.find_action("test_suites", _is_from_env)
+    if action:
         # Environment variable in the form of "SUITE1 CASE1 CASE2, SUITE2 CASE1, SUITE3, ..."
         args = [suite_with_cases.split() for suite_with_cases in args[0][0].split(",")]
 
-    return [TestSuiteConfig(test_suite, test_cases) for [test_suite, *test_cases] in args]
+    try:
+        return [
+            TestSuiteConfig(test_suite=test_suite, test_cases=test_cases)
+            for [test_suite, *test_cases] in args
+        ]
+    except ValidationError as e:
+        print(
+            "An error has occurred while validating the test suites supplied in the "
+            f"{'environment variable' if action else 'arguments'}:",
+            file=sys.stderr,
+        )
+        print(e, file=sys.stderr)
+        sys.exit(1)
 
 
 def get_settings() -> Settings:
@@ -532,7 +552,7 @@ def get_settings() -> Settings:
     args = parser.parse_args()
 
     args.dpdk_location = _process_dpdk_location(
-        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.precompiled_build_dir
+        parser, args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source
     )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 62867fd80c..6031eaf937 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -17,7 +17,12 @@
 from ipaddress import IPv4Interface, IPv6Interface
 from typing import Union
 
-from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
+from framework.config import (
+    OS,
+    DPDKBuildConfiguration,
+    NodeConfiguration,
+    TestRunConfiguration,
+)
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
 
@@ -89,13 +94,15 @@ def __init__(self, node_config: NodeConfiguration):
         self._init_ports()
 
     def _init_ports(self) -> None:
-        self.ports = [Port(port_config) for port_config in self.config.ports]
+        self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
         for port in self.ports:
             self.configure_port_state(port)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Test run setup steps.
 
@@ -105,7 +112,7 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
         self._setup_hugepages()
 
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index db37424954..294f5b36ba 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -364,7 +364,7 @@ def extract_remote_tarball(
         """
 
     @abstractmethod
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Check if the `remote_path` is a directory.
 
         Args:
@@ -375,7 +375,7 @@ def is_remote_dir(self, remote_path: str) -> bool:
         """
 
     @abstractmethod
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Check if the `remote_tarball_path` is a tar archive.
 
         Args:
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 82c84cf4f8..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -54,7 +54,7 @@ class Port:
     mac_address: str = ""
     logical_name: str = ""
 
-    def __init__(self, config: PortConfig):
+    def __init__(self, node_name: str, config: PortConfig):
         """Initialize the port from `node_name` and `config`.
 
         Args:
@@ -62,7 +62,7 @@ def __init__(self, config: PortConfig):
             config: The test run configuration of the port.
         """
         self.identifier = PortIdentifier(
-            node=config.node,
+            node=node_name,
             pci=config.pci,
         )
         self.os_driver = config.os_driver
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index d7a1f38cad..c0cca2ac50 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -201,12 +201,12 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-    def is_remote_dir(self, remote_path: str) -> bool:
+    def is_remote_dir(self, remote_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_dir`."""
         result = self.send_command(f"test -d {remote_path}")
         return not result.return_code
 
-    def is_remote_tarfile(self, remote_tarball_path: str) -> bool:
+    def is_remote_tarfile(self, remote_tarball_path: PurePath) -> bool:
         """Overrides :meth:`~.os_session.OSSession.is_remote_tarfile`."""
         result = self.send_command(f"tar -tvf {remote_tarball_path}")
         return not result.return_code
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 5474d436a1..be3faf7474 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,11 +15,17 @@
 import os
 import time
 from dataclasses import dataclass
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
-    DPDKLocation,
+    DPDKBuildOptionsConfiguration,
+    DPDKPrecompiledBuildConfiguration,
+    DPDKUncompiledBuildConfiguration,
+    LocalDPDKTarballLocation,
+    LocalDPDKTreeLocation,
+    RemoteDPDKTarballLocation,
+    RemoteDPDKTreeLocation,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
@@ -178,7 +184,9 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def set_up_test_run(
-        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+        self,
+        test_run_config: TestRunConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Extend the test run setup with vdev config and DPDK build set up.
 
@@ -188,12 +196,12 @@ def set_up_test_run(
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
-            dpdk_location: The target source of the DPDK tree.
+            dpdk_build_config: The build configuration of DPDK.
         """
-        super().set_up_test_run(test_run_config, dpdk_location)
-        for vdev in test_run_config.vdevs:
+        super().set_up_test_run(test_run_config, dpdk_build_config)
+        for vdev in test_run_config.system_under_test_node.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_config.dpdk_build_config)
+        self._set_up_dpdk(dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown and DPDK teardown."""
@@ -202,7 +210,8 @@ def tear_down_test_run(self) -> None:
         self._tear_down_dpdk()
 
     def _set_up_dpdk(
-        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+        self,
+        dpdk_build_config: DPDKBuildConfiguration,
     ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
@@ -211,21 +220,26 @@ def _set_up_dpdk(
         are bound to those that DPDK needs.
 
         Args:
-            dpdk_location: The location of the DPDK tree.
-            dpdk_build_config: A DPDK build configuration to test. If :data:`None`,
-                DTS will use pre-built DPDK from a :dataclass:`DPDKLocation`.
+            dpdk_build_config: A DPDK build configuration to test.
         """
-        self._set_remote_dpdk_tree_path(dpdk_location.dpdk_tree, dpdk_location.remote)
-        if not self._remote_dpdk_tree_path:
-            if dpdk_location.dpdk_tree:
-                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
-            elif dpdk_location.tarball:
-                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
-
-        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
-        if not self.remote_dpdk_build_dir and dpdk_build_config:
-            self._configure_dpdk_build(dpdk_build_config)
-            self._build_dpdk()
+        match dpdk_build_config.dpdk_location:
+            case RemoteDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._set_remote_dpdk_tree_path(dpdk_tree)
+            case LocalDPDKTreeLocation(dpdk_tree=dpdk_tree):
+                self._copy_dpdk_tree(dpdk_tree)
+            case RemoteDPDKTarballLocation(tarball=tarball):
+                self._validate_remote_dpdk_tarball(tarball)
+                self._prepare_and_extract_dpdk_tarball(tarball)
+            case LocalDPDKTarballLocation(tarball=tarball):
+                remote_tarball = self._copy_dpdk_tarball_to_remote(tarball)
+                self._prepare_and_extract_dpdk_tarball(remote_tarball)
+
+        match dpdk_build_config:
+            case DPDKPrecompiledBuildConfiguration(precompiled_build_dir=build_dir):
+                self._set_remote_dpdk_build_dir(build_dir)
+            case DPDKUncompiledBuildConfiguration(build_options=build_options):
+                self._configure_dpdk_build(build_options)
+                self._build_dpdk()
 
         self.bind_ports_to_driver()
 
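The dispatch above uses structural pattern matching with class patterns. A self-contained sketch of the same technique, with illustrative types rather than the framework's real location models:

    from dataclasses import dataclass

    @dataclass
    class Tree:
        path: str

    @dataclass
    class Tarball:
        path: str

    def describe(location: Tree | Tarball) -> str:
        match location:
            case Tree(path=path):
                return f"source tree at {path}"
            case Tarball(path=path):
                return f"tarball at {path}"
            case _:
                raise TypeError(f"unsupported location: {location!r}")

    assert describe(Tarball("dpdk.tar.xz")) == "tarball at dpdk.tar.xz"
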
@@ -238,37 +252,29 @@ def _tear_down_dpdk(self) -> None:
         self.compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _set_remote_dpdk_tree_path(self, dpdk_tree: str | None, remote: bool):
+    def _set_remote_dpdk_tree_path(self, dpdk_tree: PurePath) -> None:
         """Set the path to the remote DPDK source tree based on the provided DPDK location.
 
-        If :data:`dpdk_tree` and :data:`remote` are defined, check existence of :data:`dpdk_tree`
-        on SUT node and sets the `_remote_dpdk_tree_path` property. Otherwise, sets nothing.
-
-        Verify DPDK source tree existence on the SUT node, if exists sets the
-        `_remote_dpdk_tree_path` property, otherwise sets nothing.
+        Verify that the DPDK source tree exists on the SUT node and, if it does, set the
+        `_remote_dpdk_tree_path` property.
 
         Args:
             dpdk_tree: The path to the DPDK source tree directory.
-            remote: Indicates whether the `dpdk_tree` is already on the SUT node, instead of the
-                execution host.
 
         Raises:
             RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
                 is not found.
         """
-        if remote and dpdk_tree:
-            if not self.main_session.remote_path_exists(dpdk_tree):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
-                )
-            if not self.main_session.is_remote_dir(dpdk_tree):
-                raise ConfigurationError(
-                    f"Remote DPDK source tree '{dpdk_tree}' must be a directory."
-                )
-
-            self.__remote_dpdk_tree_path = PurePath(dpdk_tree)
-
-    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        if not self.main_session.remote_path_exists(dpdk_tree):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK source tree '{dpdk_tree}' not found in SUT node."
+            )
+        if not self.main_session.is_remote_dir(dpdk_tree):
+            raise ConfigurationError(f"Remote DPDK source tree '{dpdk_tree}' must be a directory.")
+
+        self.__remote_dpdk_tree_path = dpdk_tree
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: Path) -> None:
         """Copy the DPDK source tree to the SUT.
 
         Args:
@@ -288,25 +294,45 @@ def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
             self._remote_tmp_dir, PurePath(dpdk_tree_path).name
         )
 
-    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
-        """Ensure the DPDK tarball is available on the SUT node and extract it.
+    def _validate_remote_dpdk_tarball(self, dpdk_tarball: PurePath) -> None:
+        """Validate the DPDK tarball on the SUT node.
 
-        This method ensures that the DPDK source tree tarball is available on the
-        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
-        `dpdk_tarball` is already on the SUT node, it verifies its existence.
-        The `dpdk_tarball` is then extracted on the SUT node.
+        Args:
+            dpdk_tarball: The path to the DPDK tarball on the SUT node.
 
-        This method sets the `_remote_dpdk_tree_path` property to the path of the
-        extracted DPDK tree on the SUT node.
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but is
+                not found.
+            ConfigurationError: If the `dpdk_tarball` is a valid path but not a valid tar archive.
+        """
+        if not self.main_session.remote_path_exists(dpdk_tarball):
+            raise RemoteFileNotFoundError(f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT.")
+        if not self.main_session.is_remote_tarfile(dpdk_tarball):
+            raise ConfigurationError(f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive.")
+
+    def _copy_dpdk_tarball_to_remote(self, dpdk_tarball: Path) -> PurePath:
+        """Copy the local DPDK tarball to the SUT node.
 
         Args:
-            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
-            remote: Indicates whether the `dpdk_tarball` is already on the SUT node, instead of the
-                execution host.
+            dpdk_tarball: The local path to the DPDK tarball.
 
-        Raises:
-            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
-                is not found.
+        Returns:
+            The path of the copied tarball on the SUT node.
+        """
+        self._logger.info(
+            f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+        return self.main_session.join_remote_path(self._remote_tmp_dir, dpdk_tarball.name)
+
+    def _prepare_and_extract_dpdk_tarball(self, remote_tarball_path: PurePath) -> None:
+        """Prepare the remote DPDK tree path and extract the tarball.
+
+        This method extracts the remote tarball and sets the `_remote_dpdk_tree_path` property to
+        the path of the extracted DPDK tree on the SUT node.
+
+        Args:
+            remote_tarball_path: The path to the DPDK tarball on the SUT node.
         """
 
         def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
@@ -324,30 +350,9 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
                     return PurePath(str(remote_tarball_path).replace(suffixes_to_remove, ""))
             return remote_tarball_path.with_suffix("")
 
-        if remote:
-            if not self.main_session.remote_path_exists(dpdk_tarball):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' not found in SUT."
-                )
-            if not self.main_session.is_remote_tarfile(dpdk_tarball):
-                raise ConfigurationError(
-                    f"Remote DPDK tarball '{dpdk_tarball}' must be a tar archive."
-                )
-
-            remote_tarball_path = PurePath(dpdk_tarball)
-        else:
-            self._logger.info(
-                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
-            )
-            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
-
-            remote_tarball_path = self.main_session.join_remote_path(
-                self._remote_tmp_dir, PurePath(dpdk_tarball).name
-            )
-
         tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
         self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
-            PurePath(remote_tarball_path).parent,
+            remote_tarball_path.parent,
             tarball_top_dir or remove_tarball_suffix(remote_tarball_path),
         )
 
@@ -360,33 +365,32 @@ def remove_tarball_suffix(remote_tarball_path: PurePath) -> PurePath:
             self._remote_dpdk_tree_path,
         )
 
-    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+    def _set_remote_dpdk_build_dir(self, build_dir: str):
         """Set the `remote_dpdk_build_dir` on the SUT.
 
-        If :data:`build_dir` is defined, check existence on the SUT node and sets the
+        Check existence on the SUT node and set the
         `remote_dpdk_build_dir` property by joining the `_remote_dpdk_tree_path` and `build_dir`.
 
         Args:
-            build_dir: If it's defined, DPDK has been pre-built and the build directory is located
+            build_dir: The pre-built DPDK build directory, located
                 in a subdirectory of `dpdk_tree` or `tarball` root directory.
 
         Raises:
             RemoteFileNotFoundError: If the `build_dir` is expected but does not exist on the SUT
                 node.
         """
-        if build_dir:
-            remote_dpdk_build_dir = self.main_session.join_remote_path(
-                self._remote_dpdk_tree_path, build_dir
+        remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, build_dir
+        )
+        if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+            raise RemoteFileNotFoundError(
+                f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
             )
-            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
-                raise RemoteFileNotFoundError(
-                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found in SUT node."
-                )
 
-            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+        self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
 
-    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildOptionsConfiguration) -> None:
         """Populate common environment variables and set the DPDK build related properties.
 
         This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index d38ae36c2a..17b333e76a 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -99,7 +99,16 @@ def __init__(self, sut_ports: Iterable[Port], tg_ports: Iterable[Port]):
                     port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
 
         self.type = TopologyType.get_from_value(len(port_links))
-        dummy_port = Port(PortConfig("", "", "", "", "", ""))
+        dummy_port = Port(
+            "",
+            PortConfig(
+                pci="0000:00:00.0",
+                os_driver_for_dpdk="",
+                os_driver="",
+                peer_node="",
+                peer_pci="0000:00:00.0",
+            ),
+        )
         self.tg_port_egress = dummy_port
         self.sut_port_ingress = dummy_port
         self.sut_port_egress = dummy_port
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index a319fa5320..945f6bbbbb 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -38,6 +38,4 @@ def create_traffic_generator(
         case ScapyTrafficGeneratorConfig():
             return ScapyTrafficGenerator(tg_node, traffic_generator_config, privileged=True)
         case _:
-            raise ConfigurationError(
-                f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
-            )
+            raise ConfigurationError(f"Unknown traffic generator: {traffic_generator_config.type}")
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 469a12a780..5ac61cd4e1 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -45,7 +45,7 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig, **kwargs):
         """
         self._config = config
         self._tg_node = tg_node
-        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+        self._logger = get_dts_logger(f"{self._tg_node.name} {self._config.type}")
         super().__init__(tg_node, **kwargs)
 
     def send_packet(self, packet: Packet, port: Port) -> None:
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 43e2592fce..bc3f8d6d0f 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -28,7 +28,7 @@
 
 from .exception import InternalError
 
-REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+REGEX_FOR_PCI_ADDRESS: str = r"[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}"
 _REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC: str = r"(?:[\da-fA-F]{2}[:-]){5}[\da-fA-F]{2}"
 _REGEX_FOR_DOT_SEP_MAC: str = r"(?:[\da-fA-F]{4}.){2}[\da-fA-F]{4}"
 REGEX_FOR_MAC_ADDRESS: str = rf"{_REGEX_FOR_COLON_OR_HYPHEN_SEP_MAC}|{_REGEX_FOR_DOT_SEP_MAC}"
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index d7870bd40f..bc3a2a6bf9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -127,7 +127,7 @@ def test_device_bound_to_driver(self) -> None:
         path_to_devbind = self.sut_node.path_to_devbind_script
 
         all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
-            f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'",
+            f"{path_to_devbind} --status | awk '/{REGEX_FOR_PCI_ADDRESS}/'",
             SETTINGS.timeout,
         ).stdout
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 5/9] dts: remove warlock dependency
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (3 preceding siblings ...)
  2024-11-08 11:40   ` [PATCH v6 4/9] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 6/9] dts: add autodoc pydantic Luca Vizzarro
                     ` (3 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Since pydantic has completely replaced warlock, there is no longer any
need to keep it as a dependency. This removes it.
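
For context, a minimal sketch of the migration this series completes
(illustrative only, not taken from the DTS sources): warlock builds a
model class from a JSON schema at runtime, while pydantic declares the
same constraints as typed fields on a class.

    # Before: warlock derives a model class from a JSON schema.
    import warlock

    Node = warlock.model_factory(
        {"type": "object", "properties": {"name": {"type": "string"}}}
    )
    node = Node(name="SUT1")

    # After: pydantic declares the same schema as a typed model.
    from pydantic import BaseModel

    class NodeModel(BaseModel):
        name: str

    node = NodeModel.model_validate({"name": "SUT1"})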

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 dts/poetry.lock    | 227 +--------------------------------------------
 dts/pyproject.toml |   1 -
 2 files changed, 1 insertion(+), 227 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 56c50ad52c..9f7db60793 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,24 +34,6 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
-[[package]]
-name = "attrs"
-version = "23.1.0"
-description = "Classes Without Boilerplate"
-optional = false
-python-versions = ">=3.7"
-files = [
-    {file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
-    {file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
-]
-
-[package.extras]
-cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
-dev = ["attrs[docs,tests]", "pre-commit"]
-docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
-tests = ["attrs[tests-no-zope]", "zope-interface"]
-tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
-
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -491,66 +473,6 @@ MarkupSafe = ">=2.0"
 [package.extras]
 i18n = ["Babel (>=2.7)"]
 
-[[package]]
-name = "jsonpatch"
-version = "1.33"
-description = "Apply JSON-Patches (RFC 6902)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpatch-1.33-py2.py3-none-any.whl", hash = "sha256:0ae28c0cd062bbd8b8ecc26d7d164fbbea9652a1a3693f3b956c1eae5145dade"},
-    {file = "jsonpatch-1.33.tar.gz", hash = "sha256:9fcd4009c41e6d12348b4a0ff2563ba56a2923a7dfee731d004e212e1ee5030c"},
-]
-
-[package.dependencies]
-jsonpointer = ">=1.9"
-
-[[package]]
-name = "jsonpointer"
-version = "2.4"
-description = "Identify specific nodes in a JSON document (RFC 6901)"
-optional = false
-python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*, !=3.6.*"
-files = [
-    {file = "jsonpointer-2.4-py2.py3-none-any.whl", hash = "sha256:15d51bba20eea3165644553647711d150376234112651b4f1811022aecad7d7a"},
-    {file = "jsonpointer-2.4.tar.gz", hash = "sha256:585cee82b70211fa9e6043b7bb89db6e1aa49524340dde8ad6b63206ea689d88"},
-]
-
-[[package]]
-name = "jsonschema"
-version = "4.18.4"
-description = "An implementation of JSON Schema validation for Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema-4.18.4-py3-none-any.whl", hash = "sha256:971be834317c22daaa9132340a51c01b50910724082c2c1a2ac87eeec153a3fe"},
-    {file = "jsonschema-4.18.4.tar.gz", hash = "sha256:fb3642735399fa958c0d2aad7057901554596c63349f4f6b283c493cf692a25d"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-jsonschema-specifications = ">=2023.03.6"
-referencing = ">=0.28.4"
-rpds-py = ">=0.7.1"
-
-[package.extras]
-format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
-format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
-
-[[package]]
-name = "jsonschema-specifications"
-version = "2023.7.1"
-description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "jsonschema_specifications-2023.7.1-py3-none-any.whl", hash = "sha256:05adf340b659828a004220a9613be00fa3f223f2b82002e273dee62fd50524b1"},
-    {file = "jsonschema_specifications-2023.7.1.tar.gz", hash = "sha256:c91a50404e88a1f6ba40636778e2ee08f6e24c5613fe4c53ac24578a5a7f72bb"},
-]
-
-[package.dependencies]
-referencing = ">=0.28.0"
-
 [[package]]
 name = "markupsafe"
 version = "2.1.3"
@@ -1073,21 +995,6 @@ files = [
     {file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
 ]
 
-[[package]]
-name = "referencing"
-version = "0.30.0"
-description = "JSON Referencing + Python"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "referencing-0.30.0-py3-none-any.whl", hash = "sha256:c257b08a399b6c2f5a3510a50d28ab5dbc7bbde049bcaf954d43c446f83ab548"},
-    {file = "referencing-0.30.0.tar.gz", hash = "sha256:47237742e990457f7512c7d27486394a9aadaf876cbfaa4be65b27b4f4d47c6b"},
-]
-
-[package.dependencies]
-attrs = ">=22.2.0"
-rpds-py = ">=0.7.0"
-
 [[package]]
 name = "requests"
 version = "2.31.0"
@@ -1109,112 +1016,6 @@ urllib3 = ">=1.21.1,<3"
 socks = ["PySocks (>=1.5.6,!=1.5.7)"]
 use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
-[[package]]
-name = "rpds-py"
-version = "0.9.2"
-description = "Python bindings to Rust's persistent data structures (rpds)"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:ab6919a09c055c9b092798ce18c6c4adf49d24d4d9e43a92b257e3f2548231e7"},
-    {file = "rpds_py-0.9.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d55777a80f78dd09410bd84ff8c95ee05519f41113b2df90a69622f5540c4f8b"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a216b26e5af0a8e265d4efd65d3bcec5fba6b26909014effe20cd302fd1138fa"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:29cd8bfb2d716366a035913ced99188a79b623a3512292963d84d3e06e63b496"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:44659b1f326214950a8204a248ca6199535e73a694be8d3e0e869f820767f12f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:745f5a43fdd7d6d25a53ab1a99979e7f8ea419dfefebcab0a5a1e9095490ee5e"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a987578ac5214f18b99d1f2a3851cba5b09f4a689818a106c23dbad0dfeb760f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bf4151acb541b6e895354f6ff9ac06995ad9e4175cbc6d30aaed08856558201f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:03421628f0dc10a4119d714a17f646e2837126a25ac7a256bdf7c3943400f67f"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:13b602dc3e8dff3063734f02dcf05111e887f301fdda74151a93dbbc249930fe"},
-    {file = "rpds_py-0.9.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fae5cb554b604b3f9e2c608241b5d8d303e410d7dfb6d397c335f983495ce7f6"},
-    {file = "rpds_py-0.9.2-cp310-none-win32.whl", hash = "sha256:47c5f58a8e0c2c920cc7783113df2fc4ff12bf3a411d985012f145e9242a2764"},
-    {file = "rpds_py-0.9.2-cp310-none-win_amd64.whl", hash = "sha256:4ea6b73c22d8182dff91155af018b11aac9ff7eca085750455c5990cb1cfae6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:e564d2238512c5ef5e9d79338ab77f1cbbda6c2d541ad41b2af445fb200385e3"},
-    {file = "rpds_py-0.9.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f411330a6376fb50e5b7a3e66894e4a39e60ca2e17dce258d53768fea06a37bd"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0e7521f5af0233e89939ad626b15278c71b69dc1dfccaa7b97bd4cdf96536bb7"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8d3335c03100a073883857e91db9f2e0ef8a1cf42dc0369cbb9151c149dbbc1b"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d25b1c1096ef0447355f7293fbe9ad740f7c47ae032c2884113f8e87660d8f6e"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6a5d3fbd02efd9cf6a8ffc2f17b53a33542f6b154e88dd7b42ef4a4c0700fdad"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c5934e2833afeaf36bd1eadb57256239785f5af0220ed8d21c2896ec4d3a765f"},
-    {file = "rpds_py-0.9.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:095b460e117685867d45548fbd8598a8d9999227e9061ee7f012d9d264e6048d"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:91378d9f4151adc223d584489591dbb79f78814c0734a7c3bfa9c9e09978121c"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:24a81c177379300220e907e9b864107614b144f6c2a15ed5c3450e19cf536fae"},
-    {file = "rpds_py-0.9.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:de0b6eceb46141984671802d412568d22c6bacc9b230174f9e55fc72ef4f57de"},
-    {file = "rpds_py-0.9.2-cp311-none-win32.whl", hash = "sha256:700375326ed641f3d9d32060a91513ad668bcb7e2cffb18415c399acb25de2ab"},
-    {file = "rpds_py-0.9.2-cp311-none-win_amd64.whl", hash = "sha256:0766babfcf941db8607bdaf82569ec38107dbb03c7f0b72604a0b346b6eb3298"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_10_7_x86_64.whl", hash = "sha256:b1440c291db3f98a914e1afd9d6541e8fc60b4c3aab1a9008d03da4651e67386"},
-    {file = "rpds_py-0.9.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0f2996fbac8e0b77fd67102becb9229986396e051f33dbceada3debaacc7033f"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9f30d205755566a25f2ae0382944fcae2f350500ae4df4e795efa9e850821d82"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:159fba751a1e6b1c69244e23ba6c28f879a8758a3e992ed056d86d74a194a0f3"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a1f044792e1adcea82468a72310c66a7f08728d72a244730d14880cd1dabe36b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9251eb8aa82e6cf88510530b29eef4fac825a2b709baf5b94a6094894f252387"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01899794b654e616c8625b194ddd1e5b51ef5b60ed61baa7a2d9c2ad7b2a4238"},
-    {file = "rpds_py-0.9.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b0c43f8ae8f6be1d605b0465671124aa8d6a0e40f1fb81dcea28b7e3d87ca1e1"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:207f57c402d1f8712618f737356e4b6f35253b6d20a324d9a47cb9f38ee43a6b"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:b52e7c5ae35b00566d244ffefba0f46bb6bec749a50412acf42b1c3f402e2c90"},
-    {file = "rpds_py-0.9.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:978fa96dbb005d599ec4fd9ed301b1cc45f1a8f7982d4793faf20b404b56677d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:6aa8326a4a608e1c28da191edd7c924dff445251b94653988efb059b16577a4d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:aad51239bee6bff6823bbbdc8ad85136c6125542bbc609e035ab98ca1e32a192"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4bd4dc3602370679c2dfb818d9c97b1137d4dd412230cfecd3c66a1bf388a196"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:dd9da77c6ec1f258387957b754f0df60766ac23ed698b61941ba9acccd3284d1"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:190ca6f55042ea4649ed19c9093a9be9d63cd8a97880106747d7147f88a49d18"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:876bf9ed62323bc7dcfc261dbc5572c996ef26fe6406b0ff985cbcf460fc8a4c"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa2818759aba55df50592ecbc95ebcdc99917fa7b55cc6796235b04193eb3c55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9ea4d00850ef1e917815e59b078ecb338f6a8efda23369677c54a5825dbebb55"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:5855c85eb8b8a968a74dc7fb014c9166a05e7e7a8377fb91d78512900aadd13d"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:14c408e9d1a80dcb45c05a5149e5961aadb912fff42ca1dd9b68c0044904eb32"},
-    {file = "rpds_py-0.9.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:65a0583c43d9f22cb2130c7b110e695fff834fd5e832a776a107197e59a1898e"},
-    {file = "rpds_py-0.9.2-cp38-none-win32.whl", hash = "sha256:71f2f7715935a61fa3e4ae91d91b67e571aeb5cb5d10331ab681256bda2ad920"},
-    {file = "rpds_py-0.9.2-cp38-none-win_amd64.whl", hash = "sha256:674c704605092e3ebbbd13687b09c9f78c362a4bc710343efe37a91457123044"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:07e2c54bef6838fa44c48dfbc8234e8e2466d851124b551fc4e07a1cfeb37260"},
-    {file = "rpds_py-0.9.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:f7fdf55283ad38c33e35e2855565361f4bf0abd02470b8ab28d499c663bc5d7c"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:890ba852c16ace6ed9f90e8670f2c1c178d96510a21b06d2fa12d8783a905193"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:50025635ba8b629a86d9d5474e650da304cb46bbb4d18690532dd79341467846"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:517cbf6e67ae3623c5127206489d69eb2bdb27239a3c3cc559350ef52a3bbf0b"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0836d71ca19071090d524739420a61580f3f894618d10b666cf3d9a1688355b1"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c439fd54b2b9053717cca3de9583be6584b384d88d045f97d409f0ca867d80f"},
-    {file = "rpds_py-0.9.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f68996a3b3dc9335037f82754f9cdbe3a95db42bde571d8c3be26cc6245f2324"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7d68dc8acded354c972116f59b5eb2e5864432948e098c19fe6994926d8e15c3"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f963c6b1218b96db85fc37a9f0851eaf8b9040aa46dec112611697a7023da535"},
-    {file = "rpds_py-0.9.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:5a46859d7f947061b4010e554ccd1791467d1b1759f2dc2ec9055fa239f1bc26"},
-    {file = "rpds_py-0.9.2-cp39-none-win32.whl", hash = "sha256:e07e5dbf8a83c66783a9fe2d4566968ea8c161199680e8ad38d53e075df5f0d0"},
-    {file = "rpds_py-0.9.2-cp39-none-win_amd64.whl", hash = "sha256:682726178138ea45a0766907957b60f3a1bf3acdf212436be9733f28b6c5af3c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_10_7_x86_64.whl", hash = "sha256:196cb208825a8b9c8fc360dc0f87993b8b260038615230242bf18ec84447c08d"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:c7671d45530fcb6d5e22fd40c97e1e1e01965fc298cbda523bb640f3d923b387"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:83b32f0940adec65099f3b1c215ef7f1d025d13ff947975a055989cb7fd019a4"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7f67da97f5b9eac838b6980fc6da268622e91f8960e083a34533ca710bec8611"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:03975db5f103997904c37e804e5f340c8fdabbb5883f26ee50a255d664eed58c"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:987b06d1cdb28f88a42e4fb8a87f094e43f3c435ed8e486533aea0bf2e53d931"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c861a7e4aef15ff91233751619ce3a3d2b9e5877e0fcd76f9ea4f6847183aa16"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02938432352359805b6da099c9c95c8a0547fe4b274ce8f1a91677401bb9a45f"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:ef1f08f2a924837e112cba2953e15aacfccbbfcd773b4b9b4723f8f2ddded08e"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:35da5cc5cb37c04c4ee03128ad59b8c3941a1e5cd398d78c37f716f32a9b7f67"},
-    {file = "rpds_py-0.9.2-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:141acb9d4ccc04e704e5992d35472f78c35af047fa0cfae2923835d153f091be"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_10_7_x86_64.whl", hash = "sha256:79f594919d2c1a0cc17d1988a6adaf9a2f000d2e1048f71f298b056b1018e872"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-macosx_11_0_arm64.whl", hash = "sha256:a06418fe1155e72e16dddc68bb3780ae44cebb2912fbd8bb6ff9161de56e1798"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b2eb034c94b0b96d5eddb290b7b5198460e2d5d0c421751713953a9c4e47d10"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b08605d248b974eb02f40bdcd1a35d3924c83a2a5e8f5d0fa5af852c4d960af"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a0805911caedfe2736935250be5008b261f10a729a303f676d3d5fea6900c96a"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ab2299e3f92aa5417d5e16bb45bb4586171c1327568f638e8453c9f8d9e0f020"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c8d7594e38cf98d8a7df25b440f684b510cf4627fe038c297a87496d10a174f"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8b9ec12ad5f0a4625db34db7e0005be2632c1013b253a4a60e8302ad4d462afd"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:1fcdee18fea97238ed17ab6478c66b2095e4ae7177e35fb71fbe561a27adf620"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_i686.whl", hash = "sha256:933a7d5cd4b84f959aedeb84f2030f0a01d63ae6cf256629af3081cf3e3426e8"},
-    {file = "rpds_py-0.9.2-pp38-pypy38_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:686ba516e02db6d6f8c279d1641f7067ebb5dc58b1d0536c4aaebb7bf01cdc5d"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_10_7_x86_64.whl", hash = "sha256:0173c0444bec0a3d7d848eaeca2d8bd32a1b43f3d3fde6617aac3731fa4be05f"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:d576c3ef8c7b2d560e301eb33891d1944d965a4d7a2eacb6332eee8a71827db6"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ed89861ee8c8c47d6beb742a602f912b1bb64f598b1e2f3d758948721d44d468"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1054a08e818f8e18910f1bee731583fe8f899b0a0a5044c6e680ceea34f93876"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99e7c4bb27ff1aab90dcc3e9d37ee5af0231ed98d99cb6f5250de28889a3d502"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c545d9d14d47be716495076b659db179206e3fd997769bc01e2d550eeb685596"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9039a11bca3c41be5a58282ed81ae422fa680409022b996032a43badef2a3752"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fb39aca7a64ad0c9490adfa719dbeeb87d13be137ca189d2564e596f8ba32c07"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2d8b3b3a2ce0eaa00c5bbbb60b6713e94e7e0becab7b3db6c5c77f979e8ed1f1"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:99b1c16f732b3a9971406fbfe18468592c5a3529585a45a35adbc1389a529a03"},
-    {file = "rpds_py-0.9.2-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:c27ee01a6c3223025f4badd533bea5e87c988cb0ba2811b690395dfe16088cfe"},
-    {file = "rpds_py-0.9.2.tar.gz", hash = "sha256:8d70e8f14900f2657c249ea4def963bed86a29b81f81f5b76b5a9215680de945"},
-]
-
 [[package]]
 name = "scapy"
 version = "2.5.0"
@@ -1472,17 +1273,6 @@ files = [
     {file = "types_PyYAML-6.0.12.11-py3-none-any.whl", hash = "sha256:a461508f3096d1d5810ec5ab95d7eeecb651f3a15b71959999988942063bf01d"},
 ]
 
-[[package]]
-name = "typing-extensions"
-version = "4.11.0"
-description = "Backported and Experimental Type Hints for Python 3.8+"
-optional = false
-python-versions = ">=3.8"
-files = [
-    {file = "typing_extensions-4.11.0-py3-none-any.whl", hash = "sha256:c1f94d72897edaf4ce775bb7558d5b79d8126906a14ea5ed1635921406c0387a"},
-    {file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
-]
-
 [[package]]
 name = "typing-extensions"
 version = "4.12.2"
@@ -1511,22 +1301,7 @@ secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.
 socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
 zstd = ["zstandard (>=0.18.0)"]
 
-[[package]]
-name = "warlock"
-version = "2.0.1"
-description = "Python object model built on JSON schema and JSON patch."
-optional = false
-python-versions = ">=3.7,<4.0"
-files = [
-    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
-    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
-]
-
-[package.dependencies]
-jsonpatch = ">=1,<2"
-jsonschema = ">=4,<5"
-
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "6f86f59ac1f8bffc7c778a1c125b334127f6be40492b74ea23a6e42dd928f827"
+content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6c2d1ca8a4..9a3fb02ee9 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -20,7 +20,6 @@ documentation = "https://doc.dpdk.org/guides/tools/dts.html"
 
 [tool.poetry.dependencies]
 python = "^3.10"
-warlock = "^2.0.1"
 PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 6/9] dts: add autodoc pydantic
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (4 preceding siblings ...)
  2024-11-08 11:40   ` [PATCH v6 5/9] dts: remove warlock dependency Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 7/9] dts: improve configuration API docs Luca Vizzarro
                     ` (2 subsequent siblings)
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Add and enable the autodoc-pydantic sphinx extension. Pydantic models
are not correctly recognised by autodoc, causing the generated docs to
lack all of the actual model information. The autodoc-pydantic sphinx
extension fixes this by formatting the models correctly.
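
As an illustration, once the extension is enabled, a pydantic model can
be rendered in the API docs with the extension's dedicated directive.
A hypothetical usage (the model path comes from this series, but the
directive call itself is not part of this patch):

    .. autopydantic_model:: framework.config.NodeConfiguration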

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 doc/guides/conf.py       |  13 +++
 doc/guides/tools/dts.rst | 187 ++-------------------------------------
 dts/poetry.lock          |  59 +++++++++++-
 dts/pyproject.toml       |   1 +
 4 files changed, 79 insertions(+), 181 deletions(-)

diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index b553d9d5bf..71fed45b3d 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -60,6 +60,19 @@
 # DTS API docs additional configuration
 if environ.get('DTS_DOC_BUILD'):
     extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+
+    # Pydantic models require autodoc_pydantic for the right formatting
+    try:
+        import sphinxcontrib.autodoc_pydantic
+
+        extensions.append("sphinxcontrib.autodoc_pydantic")
+    except ImportError:
+        print(
+            "The DTS API doc dependencies are missing. The generated output won't be "
+            "as intended, and autodoc may throw unexpected warnings.",
+            file=stderr,
+        )
+
     # Napoleon enables the Google format of Python docstrings.
     napoleon_numpy_docstring = False
     napoleon_attr_annotations = True
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index c52de1808c..fb6504fa59 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -204,9 +204,10 @@ node, and then run the tests with the newly built binaries.
 Configuring DTS
 ~~~~~~~~~~~~~~~
 
-DTS configuration is split into nodes and test runs and build targets within test runs,
-and follows a defined schema as described in `Configuration Schema`_.
-By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_schema_example>`,
+DTS configuration is split into nodes and test runs, and must respect the model definitions as
+documented in the DTS API docs under the ``config`` page. The root of the configuration is
+represented by the ``Configuration`` model.
+By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_example>`,
 which is a template that illustrates what can be configured in DTS.
 
 The user must have :ref:`administrator privileges <sut_admin_user>`
@@ -470,184 +471,10 @@ The output is generated in ``build/doc/api/dts/html``.
 
    Make sure to fix any Sphinx warnings when adding or updating docstrings.
 
+.. _configuration_example:
 
-Configuration Schema
---------------------
-
-Definitions
-~~~~~~~~~~~
-
-_`Node name`
-   *string* – A unique identifier for a node.
-   **Examples**: ``SUT1``, ``TG1``.
-
-_`ARCH`
-   *string* – The CPU architecture.
-   **Supported values**: ``x86_64``, ``arm64``, ``ppc64le``.
-
-_`CPU`
-   *string* – The CPU microarchitecture. Use ``native`` for x86.
-   **Supported values**: ``native``, ``armv8a``, ``dpaa2``, ``thunderx``, ``xgene1``.
-
-_`OS`
-   *string* – The operating system. **Supported values**: ``linux``.
-
-_`Compiler`
-   *string* – The compiler used for building DPDK.
-   **Supported values**: ``gcc``, ``clang``, ``icc``, ``mscv``.
-
-_`Build target`
-   *mapping* – Build targets supported by DTS for building DPDK, described as:
-
-   ==================== =================================================================
-   ``arch``             See `ARCH`_
-   ``os``               See `OS`_
-   ``cpu``              See `CPU`_
-   ``compiler``         See `Compiler`_
-   ``compiler_wrapper`` *string* – Value prepended to the CC variable for the DPDK build.
-
-                        **Example**: ``ccache``
-   ==================== =================================================================
-
-_`hugepages_2mb`
-   *mapping* – hugepages_2mb described as:
-
-   ==================== ================================================================
-   ``number_of``        *integer* – The number of 2MB hugepages to configure.
-
-                        Hugepage size will be the system default.
-   ``force_first_numa`` (*optional*, defaults to ``false``) – If ``true``, it forces the
-
-                        configuration of hugepages on the first NUMA node.
-   ==================== ================================================================
-
-_`Network port`
-   *mapping* – the NIC port described as:
-
-   ====================== =================================================================================
-   ``pci``                *string* – the local PCI address of the port. **Example**: ``0000:00:08.0``
-   ``os_driver_for_dpdk`` | *string* – this port's device driver when using with DPDK
-                          | When setting up the SUT, DTS will bind the network device to this driver
-                          | for compatibility with DPDK.
-
-                          **Examples**: ``vfio-pci``, ``mlx5_core``
-   ``os_driver``          | *string* – this port's device driver when **not** using with DPDK
-                          | When tearing down the tests on the SUT, DTS will bind the network device
-                          | *back* to this driver. This driver is meant to be the one that the SUT would
-                          | normally use for this device, or whichever driver it is preferred to leave the
-                          | device bound to after testing.
-                          | This also represents the driver that is used in conjunction with the traffic
-                          | generator software.
-
-                          **Examples**: ``i40e``, ``mlx5_core``
-   ``peer_node``          *string* – the name of the peer node connected to this port.
-   ``peer_pci``           *string* – the PCI address of the peer node port. **Example**: ``000a:01:00.1``
-   ====================== =================================================================================
-
-_`Test suite`
-   *string* – name of the test suite to run. **Examples**: ``hello_world``, ``os_udp``
-
-_`Test target`
-   *mapping* – selects specific test cases to run from a test suite. Mapping is described as follows:
-
-   ========= ===============================================================================================
-   ``suite`` See `Test suite`_
-   ``cases`` (*optional*) *sequence* of *string* – list of the selected test cases in the test suite to run.
-
-             Unknown test cases will be silently ignored.
-   ========= ===============================================================================================
-
-
-Properties
-~~~~~~~~~~
-
-The configuration requires listing all the test run environments and nodes
-involved in the testing. These can be defined with the following mappings:
-
-``test runs``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the test run environments. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +----------------------------+-------------------------------------------------------------------+
-   | ``build_targets``          | *sequence* of `Build target`_                                     |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``perf``                   | *boolean* – Enable performance testing.                           |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``func``                   | *boolean* – Enable functional testing.                            |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``test_suites``            | *sequence* of **one of** `Test suite`_ **or** `Test target`_      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``skip_smoke_tests``       | (*optional*) *boolean* – Allows you to skip smoke testing         |
-   |                            | if ``true``.                                                      |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``system_under_test_node`` | System under test node specified with:                            |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``node_name`` | See `Node name`_                                  |
-   |                            +---------------+---------------------------------------------------+
-   |                            | ``vdevs``     | (*optional*) *sequence* of *string*               |
-   |                            |               |                                                   |
-   |                            |               | List of virtual devices passed with the ``--vdev``|
-   |                            |               | argument to DPDK. **Example**: ``crypto_openssl`` |
-   +----------------------------+---------------+---------------------------------------------------+
-   | ``traffic_generator_node`` | Node name for the traffic generator node.                         |
-   +----------------------------+-------------------------------------------------------------------+
-   | ``random_seed``            | (*optional*) *int* – Set a seed for pseudo-random generation.     |
-   +----------------------------+-------------------------------------------------------------------+
-
-``nodes``
-   `sequence <https://docs.python.org/3/library/stdtypes.html#sequence-types-list-tuple-range>`_ listing
-   the nodes. Each entry is described as per the following
-   `mapping <https://docs.python.org/3/library/stdtypes.html#mapping-types-dict>`_:
-
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``name``              | See `Node name`_                                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hostname``          | *string* – The network hostname or IP address of this node.                           |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``user``              | *string* – The SSH user credential to use to login to this node.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``password``          | (*optional*) *string* – The SSH password credential for this node.                    |
-   |                       |                                                                                       |
-   |                       | **NB**: Use only as last resort. SSH keys are **strongly** preferred.                 |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``arch``              | The architecture of this node. See `ARCH`_ for supported values.                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``os``                | The operating system of this node. See `OS`_ for supported values.                    |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``lcores``            | | (*optional*, defaults to 1) *string* – Comma-separated list of logical              |
-   |                       | | cores to use. An empty string means use all lcores.                                 |
-   |                       |                                                                                       |
-   |                       | **Example**: ``1,2,3,4,5,18-22``                                                      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``use_first_core``    | (*optional*, defaults to ``false``) *boolean*                                         |
-   |                       |                                                                                       |
-   |                       | Indicates whether DPDK should use only the first physical core or not.                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``memory_channels``   | (*optional*, defaults to 1) *integer*                                                 |
-   |                       |                                                                                       |
-   |                       | The number of the memory channels to use.                                             |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``hugepages_2mb``     | (*optional*) See `hugepages_2mb`_. If unset, 2MB hugepages won't be configured        |
-   |                       |                                                                                       |
-   |                       | in favour of the system configuration.                                                |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``ports``             | | *sequence* of `Network port`_ – Describe ports that are **directly** paired with    |
-   |                       | | other nodes used in conjunction with this one. Both ends of the links must be       |
-   |                       | | described. If there any inconsistencies DTS won't run.                              |
-   |                       |                                                                                       |
-   |                       | **Example**: port 1 of node ``SUT1`` is connected to port 1 of node ``TG1`` etc.      |
-   +-----------------------+---------------------------------------------------------------------------------------+
-   | ``traffic_generator`` | (*optional*) Traffic generator, if any, setup on this node described as:              |
-   |                       +----------+----------------------------------------------------------------------------+
-   |                       | ``type`` | *string* – **Supported values**: *SCAPY*                                   |
-   +-----------------------+----------+----------------------------------------------------------------------------+
-
-
-.. _configuration_schema_example:
-
-Example
-~~~~~~~
+Configuration Example
+---------------------
 
 The following example (which can be found in ``dts/conf.yaml``) sets up two nodes:
 
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 9f7db60793..ee564676b4 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -34,6 +34,29 @@ files = [
     {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"},
 ]
 
+[[package]]
+name = "autodoc-pydantic"
+version = "2.2.0"
+description = "Seamlessly integrate pydantic models in your Sphinx documentation."
+optional = false
+python-versions = "<4.0.0,>=3.8.1"
+files = [
+    {file = "autodoc_pydantic-2.2.0-py3-none-any.whl", hash = "sha256:8c6a36fbf6ed2700ea9c6d21ea76ad541b621fbdf16b5a80ee04673548af4d95"},
+]
+
+[package.dependencies]
+pydantic = ">=2.0,<3.0.0"
+pydantic-settings = ">=2.0,<3.0.0"
+Sphinx = ">=4.0"
+
+[package.extras]
+docs = ["myst-parser (>=3.0.0,<4.0.0)", "sphinx-copybutton (>=0.5.0,<0.6.0)", "sphinx-rtd-theme (>=2.0.0,<3.0.0)", "sphinx-tabs (>=3,<4)", "sphinxcontrib-mermaid (>=0.9.0,<0.10.0)"]
+erdantic = ["erdantic (<2.0)"]
+linting = ["ruff (>=0.4.0,<0.5.0)"]
+security = ["pip-audit (>=2.7.2,<3.0.0)"]
+test = ["coverage (>=7,<8)", "defusedxml (>=0.7.1)", "pytest (>=8.0.0,<9.0.0)", "pytest-sugar (>=1.0.0,<2.0.0)"]
+type-checking = ["mypy (>=1.9,<2.0)", "types-docutils (>=0.20,<0.21)", "typing-extensions (>=4.11,<5.0)"]
+
 [[package]]
 name = "babel"
 version = "2.13.1"
@@ -829,6 +852,26 @@ files = [
 [package.dependencies]
 typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
 
+[[package]]
+name = "pydantic-settings"
+version = "2.6.0"
+description = "Settings management using Pydantic"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "pydantic_settings-2.6.0-py3-none-any.whl", hash = "sha256:4a819166f119b74d7f8c765196b165f95cc7487ce58ea27dec8a5a26be0970e0"},
+    {file = "pydantic_settings-2.6.0.tar.gz", hash = "sha256:44a1804abffac9e6a30372bb45f6cafab945ef5af25e66b1c634c01dd39e0188"},
+]
+
+[package.dependencies]
+pydantic = ">=2.7.0"
+python-dotenv = ">=0.21.0"
+
+[package.extras]
+azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"]
+toml = ["tomli (>=2.0.1)"]
+yaml = ["pyyaml (>=6.0.1)"]
+
 [[package]]
 name = "pydocstyle"
 version = "6.1.1"
@@ -935,6 +978,20 @@ cffi = ">=1.4.1"
 docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
 tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
 
+[[package]]
+name = "python-dotenv"
+version = "1.0.1"
+description = "Read key-value pairs from a .env file and set them as environment variables"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "python-dotenv-1.0.1.tar.gz", hash = "sha256:e324ee90a023d808f1959c46bcbc04446a10ced277783dc6ee09987c37ec10ca"},
+    {file = "python_dotenv-1.0.1-py3-none-any.whl", hash = "sha256:f7b63ef50f1b690dddf550d03497b66d609393b40b564ed0d674909a68ebf16a"},
+]
+
+[package.extras]
+cli = ["click (>=5.0)"]
+
 [[package]]
 name = "pyyaml"
 version = "6.0.1"
@@ -1304,4 +1361,4 @@ zstd = ["zstandard (>=0.18.0)"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "310e2d3725e20ffc6ef017db92e8000c042eb2ac98a1a5eb441de17c87417e9f"
+content-hash = "fe9a9fdf7b43e8dce2fb5ee600921d4047fef2f4037a78bbd150f71df202493e"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 9a3fb02ee9..f69c70877a 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -44,6 +44,7 @@ optional = true
 sphinx = "<=7"
 sphinx-rtd-theme = ">=1.2.2"
 pyelftools = "^0.31"
+autodoc-pydantic = "^2.2.0"
 
 [build-system]
 requires = ["poetry-core>=1.0.0"]
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 7/9] dts: improve configuration API docs
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (5 preceding siblings ...)
  2024-11-08 11:40   ` [PATCH v6 6/9] dts: add autodoc pydantic Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 8/9] dts: fix custom enum behaviour with docs Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 9/9] dts: use TestSuiteSpec class imports Luca Vizzarro
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

Pydantic models are not treated the same way as dataclasses by autodoc.
As a consequence, the docstrings need to be applied directly to each
field. Otherwise, the generated API documentation page would present
two entries for each field, each with its own differences.
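
A minimal sketch of the resulting convention (illustrative only; the
real models are in the diff below, where `FrozenModel` is the DTS config
base model and `Field` comes from pydantic):

    class PortConfig(FrozenModel):
        """The port configuration of a node."""

        #: The PCI address of the port.
        pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)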

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 doc/guides/tools/dts.rst         |   5 +-
 dts/framework/config/__init__.py | 253 +++++++++++--------------------
 2 files changed, 88 insertions(+), 170 deletions(-)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index fb6504fa59..f4e297413d 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -1,5 +1,6 @@
 ..  SPDX-License-Identifier: BSD-3-Clause
     Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
+    Copyright(c) 2024 Arm Limited
 
 DPDK Test Suite
 ===============
@@ -327,8 +328,8 @@ where we deviate or where some additional clarification is helpful:
    * The ``dataclass.dataclass`` decorator changes how the attributes are processed.
      The dataclass attributes which result in instance variables/attributes
      should also be recorded in the ``Attributes:`` section.
-   * Class variables/attributes, on the other hand, should be documented with ``#:``
-     above the type annotated line.
+   * Class variables/attributes and Pydantic model fields, on the other hand, should be documented
+     with ``#:`` above the type annotated line.
      The description may be omitted if the meaning is obvious.
    * The ``Enum`` and ``TypedDict`` also process the attributes in particular ways
      and should be documented with ``#:`` as well.
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index d88fabc780..82113a6257 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -130,54 +130,34 @@ class TrafficGeneratorType(str, Enum):
 
 
 class HugepageConfiguration(FrozenModel):
-    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        number_of: The number of hugepages to allocate.
-        force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
-    """
+    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
+    #: The number of hugepages to allocate.
     number_of: int
+    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
     force_first_numa: bool
 
 
 class PortConfig(FrozenModel):
-    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        pci: The PCI address of the port.
-        os_driver_for_dpdk: The operating system driver name for use with DPDK.
-        os_driver: The operating system driver name when the operating system controls the port.
-        peer_node: The :class:`~framework.testbed_model.node.Node` of the port
-            connected to this port.
-        peer_pci: The PCI address of the port connected to this port.
-    """
+    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
 
-    pci: str = Field(
-        description="The local PCI address of the port.", pattern=REGEX_FOR_PCI_ADDRESS
-    )
-    os_driver_for_dpdk: str = Field(
-        description="The driver that the kernel should bind this device to for DPDK to use it.",
-        examples=["vfio-pci", "mlx5_core"],
-    )
-    os_driver: str = Field(
-        description="The driver normally used by this port", examples=["i40e", "ice", "mlx5_core"]
-    )
-    peer_node: str = Field(description="The name of the peer node this port is connected to.")
-    peer_pci: str = Field(
-        description="The PCI address of the peer port this port is connected to.",
-        pattern=REGEX_FOR_PCI_ADDRESS,
-    )
+    #: The PCI address of the port.
+    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
+    #: The driver that the kernel should bind this device to for DPDK to use it.
+    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
+    #: The operating system driver name when the operating system controls the port.
+    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
+    #: The name of the peer node this port is connected to.
+    peer_node: str
+    #: The PCI address of the peer port connected to this port.
+    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
 
 
 class TrafficGeneratorConfig(FrozenModel):
-    """A protocol required to define traffic generator types.
-
-    Attributes:
-        type: The traffic generator type, the child class is required to define to be distinguished
-            among others.
-    """
+    """A protocol required to define traffic generator types."""
 
+    #: The traffic generator type. Each child class is required to define this field
+    #: to be distinguished among others.
     type: TrafficGeneratorType
 
 
@@ -190,13 +170,10 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
 #: A union type discriminating traffic generators by the `type` field.
 TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
 
-
-#: A field representing logical core ranges.
+#: Comma-separated list of logical cores to use. An empty string means use all lcores.
 LogicalCores = Annotated[
     str,
     Field(
-        description="Comma-separated list of logical cores to use. "
-        "An empty string means use all lcores.",
         examples=["1,2,3,4,5,18-22", "10-15"],
         pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$",
     ),
@@ -204,61 +181,41 @@ class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
 
 
 class NodeConfiguration(FrozenModel):
-    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
-
-    Attributes:
-        name: The name of the :class:`~framework.testbed_model.node.Node`.
-        hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
-            Can be an IP or a domain name.
-        user: The name of the user used to connect to
-            the :class:`~framework.testbed_model.node.Node`.
-        password: The password of the user. The use of passwords is heavily discouraged.
-            Please use keys instead.
-        arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
-        os: The operating system of the :class:`~framework.testbed_model.node.Node`.
-        lcores: A comma delimited list of logical cores to use when running DPDK.
-        use_first_core: If :data:`True`, the first logical core won't be used.
-        hugepages: An optional hugepage configuration.
-        ports: The ports that can be used in testing.
-    """
-
-    name: str = Field(description="A unique identifier for this node.")
-    hostname: str = Field(description="The hostname or IP address of the node.")
-    user: str = Field(description="The login user to use to connect to this node.")
-    password: str | None = Field(
-        default=None,
-        description="The login password to use to connect to this node. "
-        "SSH keys are STRONGLY preferred, use only as last resort.",
-    )
+    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
+
+    #: The name of the :class:`~framework.testbed_model.node.Node`.
+    name: str
+    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
+    hostname: str
+    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
+    user: str
+    #: The password of the user. The use of passwords is heavily discouraged; please use SSH keys.
+    password: str | None = None
+    #: The architecture of the :class:`~framework.testbed_model.node.Node`.
     arch: Architecture
+    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
     os: OS
+    #: A comma delimited list of logical cores to use when running DPDK.
     lcores: LogicalCores = "1"
-    use_first_core: bool = Field(
-        default=False, description="DPDK won't use the first physical core if set to False."
-    )
+    #: If :data:`True`, the first logical core won't be used.
+    use_first_core: bool = False
+    #: An optional hugepage configuration.
     hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
+    #: The ports that can be used in testing.
     ports: list[PortConfig] = Field(min_length=1)
 
 
 class SutNodeConfiguration(NodeConfiguration):
-    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
 
-    Attributes:
-        memory_channels: The number of memory channels to use when running DPDK.
-    """
-
-    memory_channels: int = Field(
-        default=1, description="Number of memory channels to use when running DPDK."
-    )
+    #: The number of memory channels to use when running DPDK.
+    memory_channels: int = 1
 
 
 class TGNodeConfiguration(NodeConfiguration):
-    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
-
-    Attributes:
-        traffic_generator: The configuration of the traffic generator present on the TG node.
-    """
+    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
 
+    #: The configuration of the traffic generator present on the TG node.
     traffic_generator: TrafficGeneratorConfigTypes
 
 
@@ -272,20 +229,18 @@ def resolve_path(path: Path) -> Path:
 
 
 class BaseDPDKLocation(FrozenModel):
-    """DPDK location.
+    """DPDK location base class.
 
-    The path to the DPDK sources, build dir and type of location.
-
-    Attributes:
-        remote: Optional, defaults to :data:`False`. If :data:`True`, `dpdk_tree` or `tarball` is
-            located on the SUT node, instead of the execution host.
+    The path to the DPDK sources and type of location.
     """
 
+    #: Specifies whether to find DPDK on the SUT node or on the local host. These cases are
+    #: respectively represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
     remote: bool = False
 
 
 class LocalDPDKLocation(BaseDPDKLocation):
-    """Local DPDK location parent class.
+    """Local DPDK location base class.
 
     This class is meant to represent any location that is present only locally.
     """
@@ -298,14 +253,12 @@ class LocalDPDKTreeLocation(LocalDPDKLocation):
 
     This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the local host, passed as a string.
     dpdk_tree: Path
 
-    #: Resolve the local DPDK tree path
+    #: Resolve the local DPDK tree path.
     resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
 
     @model_validator(mode="after")
@@ -321,14 +274,12 @@ class LocalDPDKTarballLocation(LocalDPDKLocation):
 
     This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the local host, passed as a string.
     tarball: Path
 
-    #: Resolve the local tarball path
+    #: Resolve the local tarball path.
     resolve_tarball_path = field_validator("tarball")(resolve_path)
 
     @model_validator(mode="after")
@@ -340,7 +291,7 @@ def validate_tarball_path(self) -> Self:
 
 
 class RemoteDPDKLocation(BaseDPDKLocation):
-    """Remote DPDK location parent class.
+    """Remote DPDK location base class.
 
     This class is meant to represent any location that is present only remotely.
     """
@@ -352,11 +303,9 @@ class RemoteDPDKTreeLocation(RemoteDPDKLocation):
     """Remote DPDK tree location.
 
     This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
-
-    Attributes:
-        dpdk_tree: The path to the DPDK source tree directory.
     """
 
+    #: The path to the DPDK source tree directory on the remote node, passed as a string.
     dpdk_tree: PurePath
 
 
@@ -365,11 +314,9 @@ class RemoteDPDKTarballLocation(RemoteDPDKLocation):
 
     This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
     validation.
-
-    Attributes:
-        tarball: The path to the DPDK tarball.
     """
 
+    #: The path to the DPDK tarball on the remote node, passed as a string.
     tarball: PurePath
 
 
@@ -386,23 +333,17 @@ class BaseDPDKBuildConfiguration(FrozenModel):
     """The base configuration for different types of build.
 
     The configuration contains the location of DPDK and the configuration used for building it.
-
-    Attributes:
-        dpdk_location: The location of the DPDK tree.
     """
 
+    #: The location of the DPDK tree.
     dpdk_location: DPDKLocation
 
 
 class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration):
-    """DPDK precompiled build configuration.
-
-    Attributes:
-        precompiled_build_dir: If it's defined, DPDK has been pre-compiled and the build directory
-            is located in a subdirectory of `dpdk_tree` or `tarball` root directory. Otherwise, will
-            be using `dpdk_build_config` from configuration to build the DPDK from source.
-    """
+    """DPDK precompiled build configuration."""
 
+    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
+    #: subdirectory of the `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
     precompiled_build_dir: str = Field(min_length=1)
 
 
@@ -410,20 +351,18 @@ class DPDKBuildOptionsConfiguration(FrozenModel):
     """DPDK build options configuration.
 
     The build options used for building DPDK.
-
-    Attributes:
-        arch: The target architecture to build for.
-        os: The target os to build for.
-        cpu: The target CPU to build for.
-        compiler: The compiler executable to use.
-        compiler_wrapper: This string will be put in front of the compiler when executing the build.
-            Useful for adding wrapper commands, such as ``ccache``.
     """
 
+    #: The target architecture to build for.
     arch: Architecture
+    #: The target OS to build for.
     os: OS
+    #: The target CPU to build for.
     cpu: CPUType
+    #: The compiler executable to use.
     compiler: Compiler
+    #: This string will be put in front of the compiler when executing the build. Useful for adding
+    #: wrapper commands, such as ``ccache``.
     compiler_wrapper: str = ""
 
     @cached_property
@@ -433,12 +372,9 @@ def name(self) -> str:
 
 
 class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration):
-    """DPDK uncompiled build configuration.
-
-    Attributes:
-        build_options: The build options to compile DPDK.
-    """
+    """DPDK uncompiled build configuration."""
 
+    #: The build options to compile DPDK with.
     build_options: DPDKBuildOptionsConfiguration
 
 
@@ -462,24 +398,13 @@ class TestSuiteConfig(FrozenModel):
             # or as model fields:
             - test_suite: hello_world
               test_cases: [hello_world_single_core] # without this field all test cases are run
-
-    Attributes:
-        test_suite_name: The name of the test suite module without the starting ``TestSuite_``.
-        test_cases_names: The names of test cases from this test suite to execute.
-            If empty, all test cases will be executed.
     """
 
-    test_suite_name: str = Field(
-        title="Test suite name",
-        description="The identifying module name of the test suite without the prefix.",
-        alias="test_suite",
-    )
-    test_cases_names: list[str] = Field(
-        default_factory=list,
-        title="Test cases by name",
-        description="The identifying name of the test cases of the test suite.",
-        alias="test_cases",
-    )
+    #: The name of the test suite module without the starting ``TestSuite_``.
+    test_suite_name: str = Field(alias="test_suite")
+    #: The names of test cases from this test suite to execute. If empty, all test cases will be
+    #: executed.
+    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
 
     @cached_property
     def test_suite_spec(self) -> "TestSuiteSpec":
@@ -521,14 +446,11 @@ def validate_names(self) -> Self:
 
 
 class TestRunSUTNodeConfiguration(FrozenModel):
-    """The SUT node configuration of a test run.
-
-    Attributes:
-        node_name: The SUT node to use in this test run.
-        vdevs: The names of virtual devices to test.
-    """
+    """The SUT node configuration of a test run."""
 
+    #: The SUT node to use in this test run.
     node_name: str
+    #: The names of virtual devices to test.
     vdevs: list[str] = Field(default_factory=list)
 
 
@@ -537,25 +459,23 @@ class TestRunConfiguration(FrozenModel):
 
     The configuration contains testbed information, what tests to execute
     and with what DPDK build.
-
-    Attributes:
-        dpdk_config: The DPDK configuration used to test.
-        perf: Whether to run performance tests.
-        func: Whether to run functional tests.
-        skip_smoke_tests: Whether to skip smoke tests.
-        test_suites: The names of test suites and/or test cases to execute.
-        system_under_test_node: The SUT node configuration to use in this test run.
-        traffic_generator_node: The TG node name to use in this test run.
-        random_seed: The seed to use for pseudo-random generation.
     """
 
+    #: The DPDK configuration used to test.
     dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
-    perf: bool = Field(description="Enable performance testing.")
-    func: bool = Field(description="Enable functional testing.")
+    #: Whether to run performance tests.
+    perf: bool
+    #: Whether to run functional tests.
+    func: bool
+    #: Whether to skip smoke tests.
     skip_smoke_tests: bool = False
+    #: The names of test suites and/or test cases to execute.
     test_suites: list[TestSuiteConfig] = Field(min_length=1)
+    #: The SUT node configuration to use in this test run.
     system_under_test_node: TestRunSUTNodeConfiguration
+    #: The TG node name to use in this test run.
     traffic_generator_node: str
+    #: The seed to use for pseudo-random generation.
     random_seed: int | None = None
 
 
@@ -571,14 +491,11 @@ class TestRunWithNodesConfiguration(NamedTuple):
 
 
 class Configuration(FrozenModel):
-    """DTS testbed and test configuration.
-
-    Attributes:
-        test_runs: Test run configurations.
-        nodes: Node configurations.
-    """
+    """DTS testbed and test configuration."""
 
+    #: Test run configurations.
     test_runs: list[TestRunConfiguration] = Field(min_length=1)
+    #: Node configurations.
     nodes: list[NodeConfigurationTypes] = Field(min_length=1)
 
     @cached_property
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 8/9] dts: fix custom enum behaviour with docs
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (6 preceding siblings ...)
  2024-11-08 11:40   ` [PATCH v6 7/9] dts: improve configuration API docs Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  2024-11-08 11:40   ` [PATCH v6 9/9] dts: use TestSuiteSpec class imports Luca Vizzarro
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro

When building docs without any dependencies, autodoc will mock all the
packages missing from the system. Because DTS makes use of a special
enum library called aenum, autodoc fails to recognise enums inheriting
from it as such and raises exceptions as a consequence.

This change extends the type-checking mechanism already in place, which
pretends that aenum classes are built-in enums, to the API doc building
process.
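
In short, both modules below now guard the import like this (a sketch
of the same pattern; it assumes the API doc build exports the
DTS_DOC_BUILD environment variable before invoking Sphinx):

    from os import environ
    from typing import TYPE_CHECKING

    if TYPE_CHECKING or environ.get("DTS_DOC_BUILD"):
        # Type checkers and the API doc build resolve NoAliasEnum to the
        # built-in Enum, which autodoc can always introspect.
        from enum import Enum as NoAliasEnum
    else:
        # At runtime, the real aenum implementation is used.
        from aenum import NoAliasEnum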

Fixes: 039256daa8bf ("dts: add topology capability")
Fixes: c89d00380603 ("dts: add NIC capability support")

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
---
 dts/framework/remote_session/testpmd_shell.py | 3 ++-
 dts/framework/testbed_model/topology.py       | 3 ++-
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 8a45a5231b..1561862f46 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -20,10 +20,11 @@
 from collections.abc import Callable, MutableSet
 from dataclasses import dataclass, field
 from enum import Flag, auto
+from os import environ
 from pathlib import PurePath
 from typing import TYPE_CHECKING, Any, ClassVar, Concatenate, ParamSpec, TypeAlias
 
-if TYPE_CHECKING:
+if TYPE_CHECKING or environ.get("DTS_DOC_BUILD"):
     from enum import Enum as NoAliasEnum
 else:
     from aenum import NoAliasEnum
diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
index 17b333e76a..3824804310 100644
--- a/dts/framework/testbed_model/topology.py
+++ b/dts/framework/testbed_model/topology.py
@@ -8,9 +8,10 @@
 """
 
 from dataclasses import dataclass
+from os import environ
 from typing import TYPE_CHECKING, Iterable
 
-if TYPE_CHECKING:
+if TYPE_CHECKING or environ.get("DTS_DOC_BUILD"):
     from enum import Enum as NoAliasEnum
 else:
     from aenum import NoAliasEnum
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* [PATCH v6 9/9] dts: use TestSuiteSpec class imports
  2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
                     ` (7 preceding siblings ...)
  2024-11-08 11:40   ` [PATCH v6 8/9] dts: fix custom enum behaviour with docs Luca Vizzarro
@ 2024-11-08 11:40   ` Luca Vizzarro
  8 siblings, 0 replies; 83+ messages in thread
From: Luca Vizzarro @ 2024-11-08 11:40 UTC (permalink / raw)
  To: dev; +Cc: Paul Szczepanek, Patrick Robb, Luca Vizzarro, Nicholas Pratte

The introduction of TestSuiteSpec adds auto-discovery of test suites,
which are also automatically imported. This causes double imports when
the runner loads the test suites. This change alters the runner to load
the already-imported classes from TestSuiteSpec instead of importing
them again.
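
The removed lookup logic reduces to a single attribute access, as in
this sketch matching the diff below:

    # Reuse the test suite class already imported during discovery
    # instead of importing its module a second time.
    test_suite_class = test_suite_config.test_suite_spec.class_obj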

Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
Reviewed-by: Patrick Robb <probb@iol.unh.edu>
---
 dts/framework/runner.py | 84 ++++-------------------------------------
 1 file changed, 7 insertions(+), 77 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c3d9a27a8c..5f5837a132 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -2,6 +2,7 @@
 # Copyright(c) 2010-2019 Intel Corporation
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
+# Copyright(c) 2024 Arm Limited
 
 """Test suite runner module.
 
@@ -17,8 +18,6 @@
 and the test case stage runs test cases individually.
 """
 
-import importlib
-import inspect
 import os
 import random
 import sys
@@ -39,12 +38,7 @@
     TGNodeConfiguration,
     load_config,
 )
-from .exception import (
-    BlockingTestSuiteError,
-    ConfigurationError,
-    SSHTimeoutError,
-    TestCaseVerifyError,
-)
+from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
@@ -215,11 +209,10 @@ def _get_test_suites_with_cases(
         func: bool,
         perf: bool,
     ) -> list[TestSuiteWithCases]:
-        """Test suites with test cases discovery.
+        """Get test suites with selected cases.
 
-        The test suites with test cases defined in the user configuration are discovered
-        and stored for future use so that we don't import the modules twice and so that
-        the list of test suites with test cases is available for recording right away.
+        The test suites with test cases defined in the user configuration are selected
+        and the corresponding functions and classes are gathered.
 
         Args:
             test_suite_configs: Test suite configurations.
@@ -227,12 +220,12 @@ def _get_test_suites_with_cases(
             perf: Whether to include performance test cases in the final list.
 
         Returns:
-            The discovered test suites, each with test cases.
+            The test suites, each with test cases.
         """
         test_suites_with_cases = []
 
         for test_suite_config in test_suite_configs:
-            test_suite_class = self._get_test_suite_class(test_suite_config.test_suite_name)
+            test_suite_class = test_suite_config.test_suite_spec.class_obj
             test_cases: list[type[TestCase]] = []
             func_test_cases, perf_test_cases = test_suite_class.filter_test_cases(
                 test_suite_config.test_cases_names
@@ -245,71 +238,8 @@ def _get_test_suites_with_cases(
             test_suites_with_cases.append(
                 TestSuiteWithCases(test_suite_class=test_suite_class, test_cases=test_cases)
             )
-
         return test_suites_with_cases
 
-    def _get_test_suite_class(self, module_name: str) -> type[TestSuite]:
-        """Find the :class:`TestSuite` class in `module_name`.
-
-        The full module name is `module_name` prefixed with `self._test_suite_module_prefix`.
-        The module name is a standard filename with words separated with underscores.
-        Search the `module_name` for a :class:`TestSuite` class which starts
-        with `self._test_suite_class_prefix`, continuing with CamelCase `module_name`.
-        The first matching class is returned.
-
-        The CamelCase convention applies to abbreviations, acronyms, initialisms and so on::
-
-            OS -> Os
-            TCP -> Tcp
-
-        Args:
-            module_name: The module name without prefix where to search for the test suite.
-
-        Returns:
-            The found test suite class.
-
-        Raises:
-            ConfigurationError: If the corresponding module is not found or
-                a valid :class:`TestSuite` is not found in the module.
-        """
-
-        def is_test_suite(object) -> bool:
-            """Check whether `object` is a :class:`TestSuite`.
-
-            The `object` is a subclass of :class:`TestSuite`, but not :class:`TestSuite` itself.
-
-            Args:
-                object: The object to be checked.
-
-            Returns:
-                :data:`True` if `object` is a subclass of `TestSuite`.
-            """
-            try:
-                if issubclass(object, TestSuite) and object is not TestSuite:
-                    return True
-            except TypeError:
-                return False
-            return False
-
-        testsuite_module_path = f"{self._test_suite_module_prefix}{module_name}"
-        try:
-            test_suite_module = importlib.import_module(testsuite_module_path)
-        except ModuleNotFoundError as e:
-            raise ConfigurationError(
-                f"Test suite module '{testsuite_module_path}' not found."
-            ) from e
-
-        camel_case_suite_name = "".join(
-            [suite_word.capitalize() for suite_word in module_name.split("_")]
-        )
-        full_suite_name_to_find = f"{self._test_suite_class_prefix}{camel_case_suite_name}"
-        for class_name, class_obj in inspect.getmembers(test_suite_module, is_test_suite):
-            if class_name == full_suite_name_to_find:
-                return class_obj
-        raise ConfigurationError(
-            f"Couldn't find any valid test suites in {test_suite_module.__name__}."
-        )
-
     def _connect_nodes_and_run_test_run(
         self,
         sut_nodes: dict[str, SutNode],
-- 
2.43.0


^ permalink raw reply	[flat|nested] 83+ messages in thread

* RE: [PATCH v6 4/9] dts: use pydantic in the configuration
  2024-11-08 11:40   ` [PATCH v6 4/9] dts: use pydantic in the configuration Luca Vizzarro
@ 2024-11-20  8:48     ` Ali Alnubani
  0 siblings, 0 replies; 83+ messages in thread
From: Ali Alnubani @ 2024-11-20  8:48 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Paul Szczepanek, Patrick Robb, Nicholas Pratte

> -----Original Message-----
> From: Luca Vizzarro <luca.vizzarro@arm.com>
> Sent: Friday, November 8, 2024 1:40 PM
> To: dev@dpdk.org
> Cc: Paul Szczepanek <paul.szczepanek@arm.com>; Patrick Robb
> <probb@iol.unh.edu>; Luca Vizzarro <luca.vizzarro@arm.com>; Nicholas Pratte
> <npratte@iol.unh.edu>
> Subject: [PATCH v6 4/9] dts: use pydantic in the configuration
> 
> This change brings in pydantic in place of warlock. Pydantic offers
> a built-in model validation system in the classes, which allows for
> a more resilient and simpler code. As a consequence of this change:
> 
> - most validation is now built-in
> - further validation is added to verify:
>   - cross referencing of node names and ports
>   - test suite and test cases names
> - dictionaries representing the config schema are removed
> - the config schema is no longer used and therefore dropped
> - the TrafficGeneratorType enum has been changed from inheriting
>   StrEnum to the native str and Enum. This change was necessary to
>   enable the discriminator for object unions
> - the structure of the classes has been slightly changed to perfectly
>   match the structure of the configuration files
> - the test suite argument catches the ValidationError that
>   TestSuiteConfig can now raise
> - the DPDK location has been wrapped under another configuration
>   mapping `dpdk_location`
> - the DPDK locations are now structured and enforced by classes,
>   further simplifying the validation and handling thanks to
>   pattern matching
> 
> Bugzilla ID: 1508
> 
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
> Reviewed-by: Patrick Robb <probb@iol.unh.edu>
> ---

Hello,

Documentation fails to build on 24.11.0-rc3 in Fedora 40 because of this patchset:

"""
$ ninja-build -C build doc
[..]
[4/6] Generating doc/api/dts/dts_api_html with a custom command
FAILED: doc/api/dts/html
/usr/bin/python3 ../..buildtools/call-sphinx-build.py /usr/bin/sphinx-build 24.11.0-rc2 doc/api/dts build/doc/api/html/dts -E -c doc/guides -W
[..]
Warning, treated as error:
Failed to get a method signature for framework.config.TestSuiteConfig.convert_from_string: <classmethod(<function TestSuiteConfig.convert_from_string at 0x7f4922675e40>)> is not a callable object
[5/6] Generating doc/guides/html_guides with a custom command
"""

Can you please have a look?

Thanks,
Ali

^ permalink raw reply	[flat|nested] 83+ messages in thread

* RE: [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery
  2024-11-08 11:39   ` [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery Luca Vizzarro
@ 2024-11-20  8:48     ` Ali Alnubani
  0 siblings, 0 replies; 83+ messages in thread
From: Ali Alnubani @ 2024-11-20  8:48 UTC (permalink / raw)
  To: Luca Vizzarro, dev; +Cc: Paul Szczepanek, Patrick Robb, Nicholas Pratte

> -----Original Message-----
> From: Luca Vizzarro <luca.vizzarro@arm.com>
> Sent: Friday, November 8, 2024 1:40 PM
> To: dev@dpdk.org
> Cc: Paul Szczepanek <paul.szczepanek@arm.com>; Patrick Robb
> <probb@iol.unh.edu>; Luca Vizzarro <luca.vizzarro@arm.com>; Nicholas Pratte
> <npratte@iol.unh.edu>
> Subject: [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery
> 
> Currently there is a lack of a definition which identifies all the test
> suites available to test. This change intends to simplify the process to
> discover all the test suites and identify them.
> 
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>
> Reviewed-by: Patrick Robb <probb@iol.unh.edu>
> ---

Hello,

Documentation seems to be failing to build on top of this patch on Fedora 40:

"""
$ ninja-build -C build doc
[..]
Warning, treated as error:
autodoc: failed to import module 'runner' from module 'framework'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/lib/python3.12/site-packages/sphinx/ext/autodoc/importer.py", line 69, in import_module
    return importlib.import_module(modname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "dts/framework/runner.py", line 42, in <module>
    from .test_result import (
  File "dts/framework/test_result.py", line 37, in <module>
    from .test_suite import TestCase, TestSuite
  File "dts/framework/test_suite.py", line 715, in <module>
    AVAILABLE_TEST_SUITES: list[TestSuiteSpec] = TestSuiteSpec.discover_all()
                                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "dts/framework/test_suite.py", line 707, in discover_all
    if test_suite.class_obj:
       ^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/functools.py", line 993, in __get__
    val = self.func(instance)
          ^^^^^^^^^^^^^^^^^^^
  File "dts/framework/test_suite.py", line 658, in class_obj
    for class_name, class_obj in inspect.getmembers(self.module, is_test_suite):
                                                    ^^^^^^^^^^^
  File "/usr/lib64/python3.12/functools.py", line 993, in __get__
    val = self.func(instance)
          ^^^^^^^^^^^^^^^^^^^
  File "dts/framework/test_suite.py", line 629, in module
    return import_module(f"{self.TEST_SUITES_PACKAGE_NAME}.{self.module_name}")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "dts/tests/TestSuite_hello_world.py", line 20, in <module>
    @requires(topology_type=TopologyType.no_link)
     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "dts/framework/testbed_model/capability.py", line 479, in add_required_capability
    topology_capability = TopologyCapability.get_unique(topology_type)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "dts/framework/testbed_model/capability.py", line 331, in get_unique
    if topology_type.name not in cls._unique_capabilities:
       ^^^^^^^^^^^^^^^^^^
AttributeError: 'int' object has no attribute 'name'
"""

Regards,
Ali

^ permalink raw reply	[flat|nested] 83+ messages in thread

end of thread, other threads:[~2024-11-20  8:48 UTC | newest]

Thread overview: 83+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-08-22 16:39 [PATCH 0/5] dts: Pydantic configuration Luca Vizzarro
2024-08-22 16:39 ` [PATCH 1/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-09-16 13:00   ` Juraj Linkeš
2024-10-29 12:57     ` Luca Vizzarro
2024-09-19 20:01   ` Nicholas Pratte
2024-08-22 16:39 ` [PATCH 2/5] dts: add Pydantic and remove Warlock Luca Vizzarro
2024-09-16 13:17   ` Juraj Linkeš
2024-09-19 19:56   ` Nicholas Pratte
2024-09-30 20:41   ` Dean Marx
2024-08-22 16:39 ` [PATCH 3/5] dts: use Pydantic in the configuration Luca Vizzarro
2024-09-17 11:13   ` Juraj Linkeš
2024-10-29 13:00     ` Luca Vizzarro
2024-09-30 17:56   ` Nicholas Pratte
2024-10-29 12:41     ` Luca Vizzarro
2024-09-30 21:45   ` Dean Marx
2024-10-29 12:51     ` Luca Vizzarro
2024-08-22 16:39 ` [PATCH 4/5] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-09-17 11:39   ` Juraj Linkeš
2024-10-29 12:52     ` Luca Vizzarro
2024-10-01 17:12   ` Dean Marx
2024-10-29 12:54     ` Luca Vizzarro
2024-10-01 20:45   ` Nicholas Pratte
2024-10-29 12:56     ` Luca Vizzarro
2024-11-04 17:49       ` Nicholas Pratte
2024-08-22 16:39 ` [PATCH 5/5] dts: add JSON schema generation script Luca Vizzarro
2024-09-17 11:59   ` Juraj Linkeš
2024-10-01 20:48   ` Nicholas Pratte
2024-10-25 15:58 ` [PATCH v2 0/5] dts: Pydantic configuration Luca Vizzarro
2024-10-25 15:58   ` [PATCH v2 1/5] dts: add pydantic dependency Luca Vizzarro
2024-10-25 15:58   ` [PATCH v2 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-10-25 15:58   ` [PATCH v2 3/5] dts: use pydantic in the configuration Luca Vizzarro
2024-10-25 15:58   ` [PATCH v2 4/5] dts: remove warlock dependency Luca Vizzarro
2024-10-25 15:58   ` [PATCH v2 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-10-25 16:43 ` [PATCH v3 0/5] dts: Pydantic configuration Luca Vizzarro
2024-10-25 16:43   ` [PATCH v3 1/5] dts: add pydantic dependency Luca Vizzarro
2024-10-25 16:43   ` [PATCH v3 2/5] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-10-25 16:43   ` [PATCH v3 3/5] dts: use pydantic in the configuration Luca Vizzarro
2024-10-25 16:43   ` [PATCH v3 4/5] dts: remove warlock dependency Luca Vizzarro
2024-10-25 16:43   ` [PATCH v3 5/5] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-10-28 17:49 ` [PATCH v4 0/8] dts: Pydantic configuration Luca Vizzarro
2024-10-28 17:49   ` [PATCH v4 1/8] dts: add pydantic dependency Luca Vizzarro
2024-10-31 18:42     ` Nicholas Pratte
2024-10-28 17:49   ` [PATCH v4 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-10-31 19:32     ` Nicholas Pratte
2024-10-31 20:21     ` Nicholas Pratte
2024-11-06 17:58       ` Luca Vizzarro
2024-10-28 17:49   ` [PATCH v4 3/8] dts: refactor build and node info classes Luca Vizzarro
2024-10-31 20:16     ` Nicholas Pratte
2024-11-06 18:02       ` Luca Vizzarro
2024-10-28 17:49   ` [PATCH v4 4/8] dts: use pydantic in the configuration Luca Vizzarro
2024-10-31 20:20     ` Nicholas Pratte
2024-10-28 17:49   ` [PATCH v4 5/8] dts: remove warlock dependency Luca Vizzarro
2024-10-31 20:23     ` Nicholas Pratte
2024-10-28 17:49   ` [PATCH v4 6/8] dts: add autodoc pydantic Luca Vizzarro
2024-10-31 20:52     ` Nicholas Pratte
2024-11-06 18:04       ` Luca Vizzarro
2024-10-28 17:49   ` [PATCH v4 7/8] dts: improve configuration API docs Luca Vizzarro
2024-11-04 17:34     ` Nicholas Pratte
2024-10-28 17:49   ` [PATCH v4 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-11-04 17:50     ` Nicholas Pratte
2024-11-06 18:09 ` [PATCH v5 0/8] dts: Pydantic configuration Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 1/8] dts: add pydantic dependency Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 2/8] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 3/8] dts: refactor build and node info classes Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 4/8] dts: use pydantic in the configuration Luca Vizzarro
2024-11-07  0:33     ` Patrick Robb
2024-11-06 18:09   ` [PATCH v5 5/8] dts: remove warlock dependency Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 6/8] dts: add autodoc pydantic Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 7/8] dts: improve configuration API docs Luca Vizzarro
2024-11-06 18:09   ` [PATCH v5 8/8] dts: use TestSuiteSpec class imports Luca Vizzarro
2024-11-07  0:34   ` [PATCH v5 0/8] dts: Pydantic configuration Patrick Robb
2024-11-08 11:39 ` [PATCH v6 0/9] " Luca Vizzarro
2024-11-08 11:39   ` [PATCH v6 1/9] dts: add pydantic dependency Luca Vizzarro
2024-11-08 11:39   ` [PATCH v6 2/9] dts: add TestSuiteSpec class and discovery Luca Vizzarro
2024-11-20  8:48     ` Ali Alnubani
2024-11-08 11:39   ` [PATCH v6 3/9] dts: refactor build and node info classes Luca Vizzarro
2024-11-08 11:40   ` [PATCH v6 4/9] dts: use pydantic in the configuration Luca Vizzarro
2024-11-20  8:48     ` Ali Alnubani
2024-11-08 11:40   ` [PATCH v6 5/9] dts: remove warlock dependency Luca Vizzarro
2024-11-08 11:40   ` [PATCH v6 6/9] dts: add autodoc pydantic Luca Vizzarro
2024-11-08 11:40   ` [PATCH v6 7/9] dts: improve configuration API docs Luca Vizzarro
2024-11-08 11:40   ` [PATCH v6 8/9] dts: fix custom enum behaviour with docs Luca Vizzarro
2024-11-08 11:40   ` [PATCH v6 9/9] dts: use TestSuiteSpec class imports Luca Vizzarro

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox;
as well as URLs for NNTP newsgroup(s).