DPDK patches and discussions
From: Nicholas Pratte <npratte@iol.unh.edu>
To: Luca Vizzarro <luca.vizzarro@arm.com>
Cc: dev@dpdk.org, Paul Szczepanek <paul.szczepanek@arm.com>,
	Dean Marx <dmarx@iol.unh.edu>, Patrick Robb <probb@iol.unh.edu>
Subject: Re: [PATCH v4 6/7] dts: split configuration file
Date: Fri, 24 Jan 2025 13:18:47 -0500	[thread overview]
Message-ID: <CAKXZ7eg=Ym4uTeu0-RncA6Nn-HxaArTanXng6RUuJz91=8SRrw@mail.gmail.com> (raw)
In-Reply-To: <20250124113909.137128-7-luca.vizzarro@arm.com>

This is great! Before Jeremy left, he suggested going a step further
and putting the config in a directory of its own, potentially offering
more flexibility. Something we could consider looking into in the
future, if there is time.
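
Just to illustrate the idea (the directory and file names here are purely
hypothetical), that could end up looking something like:

    dts/config/
        nodes.yaml
        test_runs.yaml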

Reviewed-by: Nicholas Pratte <npratte@iol.unh.edu>

On Fri, Jan 24, 2025 at 6:39 AM Luca Vizzarro <luca.vizzarro@arm.com> wrote:
>
> To avoid the creation of a big monolithic configuration file, nodes and
> test runs are now split into distinct files. This also allows
> flexibility to run different test runs on the same nodes.
>
> Since there are now 2 distinct configuration files, there are also 2
> command line arguments to specify them.
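> 
> For example, assuming copies of the example templates named
> test_runs.yaml and nodes.yaml, an invocation could look like:
> 
>   $ ./main.py --test-runs-config-file test_runs.yaml --nodes-config-file nodes.yaml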
>
> Bugzilla ID: 1344
>
> Signed-off-by: Nicholas Pratte <npratte@iol.unh.edu>
> Signed-off-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Paul Szczepanek <paul.szczepanek@arm.com>
> Reviewed-by: Dean Marx <dmarx@iol.unh.edu>
> ---
>  doc/guides/tools/dts.rst                      |  78 ++-
>  dts/.gitignore                                |   4 +
>  dts/conf.yaml                                 |  84 ---
>  dts/framework/config/__init__.py              | 526 ++----------------
>  dts/framework/config/common.py                |  59 ++
>  dts/framework/config/node.py                  | 144 +++++
>  dts/framework/config/test_run.py              | 290 ++++++++++
>  dts/framework/runner.py                       |  11 +-
>  dts/framework/settings.py                     |  37 +-
>  dts/framework/test_result.py                  |   2 +-
>  dts/framework/testbed_model/node.py           |   6 +-
>  dts/framework/testbed_model/os_session.py     |   2 +-
>  dts/framework/testbed_model/port.py           |   2 +-
>  dts/framework/testbed_model/sut_node.py       |   6 +-
>  dts/framework/testbed_model/tg_node.py        |   2 +-
>  dts/framework/testbed_model/topology.py       |   2 +-
>  .../traffic_generator/__init__.py             |   2 +-
>  .../testbed_model/traffic_generator/scapy.py  |   2 +-
>  .../traffic_generator/traffic_generator.py    |   2 +-
>  dts/nodes.example.yaml                        |  53 ++
>  dts/test_runs.example.yaml                    |  33 ++
>  dts/tests/TestSuite_smoke_tests.py            |   2 +-
>  22 files changed, 729 insertions(+), 620 deletions(-)
>  create mode 100644 dts/.gitignore
>  delete mode 100644 dts/conf.yaml
>  create mode 100644 dts/framework/config/common.py
>  create mode 100644 dts/framework/config/node.py
>  create mode 100644 dts/framework/config/test_run.py
>  create mode 100644 dts/nodes.example.yaml
>  create mode 100644 dts/test_runs.example.yaml
>
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index abc389b42a..6fc4eb8dac 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -210,8 +210,10 @@ DTS configuration is split into nodes and test runs,
>  and must respect the model definitions
>  as documented in the DTS API docs under the ``config`` page.
>  The root of the configuration is represented by the ``Configuration`` model.
> -By default, DTS will try to use the ``dts/conf.yaml`` :ref:`config file <configuration_example>`,
> -which is a template that illustrates what can be configured in DTS.
> +By default, DTS will try to use the ``dts/test_runs.yaml`` and ``dts/nodes.yaml``
> +config files. These can be created from the ``dts/test_runs.example.yaml``
> +:ref:`template <test_runs_configuration_example>` and ``dts/nodes.example.yaml``
> +:ref:`template <nodes_configuration_example>`, which illustrate what can be configured in DTS.
>
>  The user must have :ref:`administrator privileges <sut_admin_user>`
>  which don't require password authentication.
> @@ -225,16 +227,19 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
>  .. code-block:: console
>
>     (dts-py3.10) $ ./main.py --help
> -   usage: main.py [-h] [--config-file FILE_PATH] [--output-dir DIR_PATH] [-t SECONDS] [-v] [--dpdk-tree DIR_PATH | --tarball FILE_PATH] [--remote-source]
> -                  [--precompiled-build-dir DIR_NAME] [--compile-timeout SECONDS] [--test-suite TEST_SUITE [TEST_CASES ...]] [--re-run N_TIMES]
> -                  [--random-seed NUMBER]
> +   usage: main.py [-h] [--test-runs-config-file FILE_PATH] [--nodes-config-file FILE_PATH] [--output-dir DIR_PATH] [-t SECONDS] [-v]
> +                  [--dpdk-tree DIR_PATH | --tarball FILE_PATH] [--remote-source] [--precompiled-build-dir DIR_NAME]
> +                  [--compile-timeout SECONDS] [--test-suite TEST_SUITE [TEST_CASES ...]] [--re-run N_TIMES] [--random-seed NUMBER]
>
> -   Run DPDK test suites. All options may be specified with the environment variables provided in brackets. Command line arguments have higher priority.
> +   Run DPDK test suites. All options may be specified with the environment variables provided in brackets. Command line arguments have higher
> +   priority.
>
>     options:
>       -h, --help            show this help message and exit
> -     --config-file FILE_PATH
> -                           [DTS_CFG_FILE] The configuration file that describes the test cases, SUTs and DPDK build configs. (default: conf.yaml)
> +     --test-runs-config-file FILE_PATH
> +                           [DTS_TEST_RUNS_CFG_FILE] The configuration file that describes the test cases and DPDK build options. (default: test_runs.yaml)
> +     --nodes-config-file FILE_PATH
> +                           [DTS_NODES_CFG_FILE] The configuration file that describes the SUT and TG nodes. (default: nodes.yaml)
>       --output-dir DIR_PATH, --output DIR_PATH
>                             [DTS_OUTPUT_DIR] Output directory where DTS logs and results are saved. (default: output)
>       -t SECONDS, --timeout SECONDS
> @@ -243,31 +248,31 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
>       --compile-timeout SECONDS
>                             [DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK. (default: 1200)
>       --test-suite TEST_SUITE [TEST_CASES ...]
> -                           [DTS_TEST_SUITES] A list containing a test suite with test cases. The first parameter is the test suite name, and the rest are
> -                           test case names, which are optional. May be specified multiple times. To specify multiple test suites in the environment
> -                           variable, join the lists with a comma. Examples: --test-suite suite case case --test-suite suite case ... |
> -                           DTS_TEST_SUITES='suite case case, suite case, ...' | --test-suite suite --test-suite suite case ... | DTS_TEST_SUITES='suite,
> -                           suite case, ...' (default: [])
> +                           [DTS_TEST_SUITES] A list containing a test suite with test cases. The first parameter is the test suite name, and
> +                           the rest are test case names, which are optional. May be specified multiple times. To specify multiple test suites
> +                           in the environment variable, join the lists with a comma. Examples: --test-suite suite case case --test-suite
> +                           suite case ... | DTS_TEST_SUITES='suite case case, suite case, ...' | --test-suite suite --test-suite suite case
> +                           ... | DTS_TEST_SUITES='suite, suite case, ...' (default: [])
>       --re-run N_TIMES, --re_run N_TIMES
>                             [DTS_RERUN] Re-run each test case the specified number of times if a test failure occurs. (default: 0)
> -     --random-seed NUMBER  [DTS_RANDOM_SEED] The seed to use with the pseudo-random generator. If not specified, the configuration value is used instead.
> -                           If that's also not specified, a random seed is generated. (default: None)
> +     --random-seed NUMBER  [DTS_RANDOM_SEED] The seed to use with the pseudo-random generator. If not specified, the configuration value is
> +                           used instead. If that's also not specified, a random seed is generated. (default: None)
>
>     DPDK Build Options:
> -     Arguments in this group (and subgroup) will be applied to a DPDKLocation when the DPDK tree, tarball or revision will be provided, other arguments
> -     like remote source and build dir are optional. A DPDKLocation from settings are used instead of from config if construct successful.
> +     Arguments in this group (and subgroup) will be applied to a DPDKLocation when the DPDK tree, tarball or revision will be provided,
> +     other arguments like remote source and build dir are optional. A DPDKLocation from settings are used instead of from config if
> +     construct successful.
>
> -     --dpdk-tree DIR_PATH  [DTS_DPDK_TREE] The path to the DPDK source tree directory to test. Cannot be used in conjunction with --tarball. (default:
> -                             None)
> +     --dpdk-tree DIR_PATH  [DTS_DPDK_TREE] The path to the DPDK source tree directory to test. Cannot be used in conjunction with --tarball.
> +                           (default: None)
>       --tarball FILE_PATH, --snapshot FILE_PATH
> -                           [DTS_DPDK_TARBALL] The path to the DPDK source tarball to test. DPDK must be contained in a folder with the same name as the
> -                           tarball file. Cannot be used in conjunction with --dpdk-tree. (default: None)
> -     --remote-source       [DTS_REMOTE_SOURCE] Set this option if either the DPDK source tree or tarball to be used are located on the SUT node. Can only
> -                           be used with --dpdk-tree or --tarball. (default: False)
> +                           [DTS_DPDK_TARBALL] The path to the DPDK source tarball to test. DPDK must be contained in a folder with the same
> +                           name as the tarball file. Cannot be used in conjunction with --dpdk-tree. (default: None)
> +     --remote-source       [DTS_REMOTE_SOURCE] Set this option if either the DPDK source tree or tarball to be used are located on the SUT
> +                           node. Can only be used with --dpdk-tree or --tarball. (default: False)
>       --precompiled-build-dir DIR_NAME
> -                           [DTS_PRECOMPILED_BUILD_DIR] Define the subdirectory under the DPDK tree root directory where the pre-compiled binaries are
> -                           located. If set, DTS will build DPDK under the `build` directory instead. Can only be used with --dpdk-tree or --tarball.
> -                           (default: None)
> +                           [DTS_PRECOMPILED_BUILD_DIR] Define the subdirectory under the DPDK tree root directory or tarball where the pre-
> +                           compiled binaries are located. (default: None)
>
>
>  The brackets contain the names of environment variables that set the same thing.
> @@ -467,7 +472,7 @@ The output is generated in ``build/doc/api/dts/html``.
>  Configuration Example
>  ---------------------
>
> -The following example (which can be found in ``dts/conf.yaml``) sets up two nodes:
> +The following example configuration files set up two nodes:
>
>  * ``SUT1`` which is already setup with the DPDK build requirements and any other
>    required for execution;
> @@ -479,6 +484,21 @@ And they both have two network ports which are physically connected to each othe
>     This example assumes that you have setup SSH keys in both the system under test
>     and traffic generator nodes.
>
> -.. literalinclude:: ../../../dts/conf.yaml
> +.. _test_runs_configuration_example:
> +
> +``dts/test_runs.example.yaml``
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +.. literalinclude:: ../../../dts/test_runs.example.yaml
> +   :language: yaml
> +   :start-at: # Define
> +
> +.. _nodes_configuration_example:
> +
> +
> +``dts/nodes.example.yaml``
> +~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +.. literalinclude:: ../../../dts/nodes.example.yaml
>     :language: yaml
> -   :start-at: test_runs:
> +   :start-at: # Define
> diff --git a/dts/.gitignore b/dts/.gitignore
> new file mode 100644
> index 0000000000..d53a2f3b7e
> --- /dev/null
> +++ b/dts/.gitignore
> @@ -0,0 +1,4 @@
> +# default configuration files for DTS
> +nodes.yaml
> +test_runs.yaml
> +
> diff --git a/dts/conf.yaml b/dts/conf.yaml
> deleted file mode 100644
> index bc78882d0d..0000000000
> --- a/dts/conf.yaml
> +++ /dev/null
> @@ -1,84 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright 2022-2023 The DPDK contributors
> -# Copyright 2023 Arm Limited
> -
> -test_runs:
> -  # define one test run environment
> -  - dpdk_build:
> -      dpdk_location:
> -        # dpdk_tree: Commented out because `tarball` is defined.
> -        tarball: dpdk-tarball.tar.xz
> -        # Either `dpdk_tree` or `tarball` can be defined, but not both.
> -        remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> -                      # is located on the SUT node, instead of the execution host.
> -
> -      # precompiled_build_dir: Commented out because `build_options` is defined.
> -      build_options:
> -        # the combination of the following two makes CC="ccache gcc"
> -        compiler: gcc
> -        compiler_wrapper: ccache # Optional.
> -      # If `precompiled_build_dir` is defined, DPDK has been pre-built and the build directory is
> -      # in a subdirectory of DPDK tree root directory. Otherwise, will be using the `build_options`
> -      # to build the DPDK from source. Either `precompiled_build_dir` or `build_options` can be
> -      # defined, but not both.
> -    perf: false # disable performance testing
> -    func: true # enable functional testing
> -    skip_smoke_tests: false # optional
> -    test_suites: # the following test suites will be run in their entirety
> -      - hello_world
> -    vdevs: # optional; if removed, vdevs won't be used in the execution
> -      - "crypto_openssl"
> -    # The machine running the DPDK test executable
> -    system_under_test_node: "SUT 1"
> -    # Traffic generator node to use for this execution environment
> -    traffic_generator_node: "TG 1"
> -nodes:
> -  # Define a system under test node, having two network ports physically
> -  # connected to the corresponding ports in TG 1 (the peer node)
> -  - name: "SUT 1"
> -    hostname: sut1.change.me.localhost
> -    user: dtsuser
> -    os: linux
> -    ports:
> -      # sets up the physical link between "SUT 1"@0000:00:08.0 and "TG 1"@0000:00:08.0
> -      - pci: "0000:00:08.0"
> -        os_driver_for_dpdk: vfio-pci # OS driver that DPDK will use
> -        os_driver: i40e              # OS driver to bind when the tests are not running
> -        peer_node: "TG 1"
> -        peer_pci: "0000:00:08.0"
> -      # sets up the physical link between "SUT 1"@0000:00:08.1 and "TG 1"@0000:00:08.1
> -      - pci: "0000:00:08.1"
> -        os_driver_for_dpdk: vfio-pci
> -        os_driver: i40e
> -        peer_node: "TG 1"
> -        peer_pci: "0000:00:08.1"
> -    hugepages_2mb: # optional; if removed, will use system hugepage configuration
> -        number_of: 256
> -        force_first_numa: false
> -    dpdk_config:
> -        lcores: "" # use all available logical cores (Skips first core)
> -        memory_channels: 4 # tells DPDK to use 4 memory channels
> -  # Define a Scapy traffic generator node, having two network ports
> -  # physically connected to the corresponding ports in SUT 1 (the peer node).
> -  - name: "TG 1"
> -    hostname: tg1.change.me.localhost
> -    user: dtsuser
> -    os: linux
> -    ports:
> -      # sets up the physical link between "TG 1"@0000:00:08.0 and "SUT 1"@0000:00:08.0
> -      - pci: "0000:00:08.0"
> -        os_driver_for_dpdk: rdma
> -        os_driver: rdma
> -        peer_node: "SUT 1"
> -        peer_pci: "0000:00:08.0"
> -      # sets up the physical link between "SUT 1"@0000:00:08.0 and "TG 1"@0000:00:08.0
> -      - pci: "0000:00:08.1"
> -        os_driver_for_dpdk: rdma
> -        os_driver: rdma
> -        peer_node: "SUT 1"
> -        peer_pci: "0000:00:08.1"
> -    hugepages_2mb: # optional; if removed, will use system hugepage configuration
> -        number_of: 256
> -        force_first_numa: false
> -    traffic_generator:
> -        type: SCAPY
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index 6ae98d0387..adbd4e952d 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -8,20 +8,15 @@
>
>  This package offers classes that hold real-time information about the testbed, hold test run
>  configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
> -the YAML test run configuration file and validates it against the :class:`Configuration` Pydantic
> -model.
> +the YAML configuration files and validates them against the :class:`Configuration` Pydantic
> +model, whose fields map directly to the configuration keys.
>
> -The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
> -this package. The allowed keys and types inside this dictionary map directly to the
> -:class:`Configuration` model, its fields and sub-models.
> +The configuration files are split into:
>
> -The test run configuration has two main sections:
> -
> -    * The :class:`TestRunConfiguration` which defines what tests are going to be run
> -      and how DPDK will be built. It also references the testbed where these tests and DPDK
> -      are going to be run,
> -    * The nodes of the testbed are defined in the other section,
> -      a :class:`list` of :class:`NodeConfiguration` objects.
> +    * A list of test runs, each represented by :class:`~.test_run.TestRunConfiguration`,
> +      defining what tests are going to be run and how DPDK will be built. It also references
> +      the testbed where these tests and DPDK are going to be run,
> +    * A list of the testbed's nodes, each represented by :class:`~.node.NodeConfiguration`.
>
>  The real-time information about testbed is supposed to be gathered at runtime.
>
> @@ -32,467 +27,24 @@
>        and makes it thread safe should we ever want to move in that direction.
>  """
>
> -import tarfile
> -from collections.abc import Callable, MutableMapping
> -from enum import Enum, auto, unique
>  from functools import cached_property
> -from pathlib import Path, PurePath
> -from typing import TYPE_CHECKING, Annotated, Any, Literal, NamedTuple, TypedDict, cast
> +from pathlib import Path
> +from typing import Annotated, Any, Literal, NamedTuple, TypeVar, cast
>
>  import yaml
> -from pydantic import (
> -    BaseModel,
> -    ConfigDict,
> -    Field,
> -    ValidationError,
> -    ValidationInfo,
> -    field_validator,
> -    model_validator,
> -)
> +from pydantic import Field, TypeAdapter, ValidationError, field_validator, model_validator
>  from typing_extensions import Self
>
>  from framework.exception import ConfigurationError
> -from framework.settings import Settings
> -from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
> -
> -if TYPE_CHECKING:
> -    from framework.test_suite import TestSuiteSpec
> -
> -
> -class ValidationContext(TypedDict):
> -    """A context dictionary to use for validation."""
> -
> -    #: The command line settings.
> -    settings: Settings
> -
> -
> -def load_fields_from_settings(
> -    *fields: str | tuple[str, str],
> -) -> Callable[[Any, ValidationInfo], Any]:
> -    """Before model validator that injects values from :attr:`ValidationContext.settings`.
> -
> -    Args:
> -        *fields: The name of the fields to apply the argument value to. If the settings field name
> -            is not the same as the configuration field, supply a tuple with the respective names.
> -
> -    Returns:
> -        Pydantic before model validator.
> -    """
> -
> -    def _loader(data: Any, info: ValidationInfo) -> Any:
> -        if not isinstance(data, MutableMapping):
> -            return data
> -
> -        settings = cast(ValidationContext, info.context)["settings"]
> -        for field in fields:
> -            if isinstance(field, tuple):
> -                settings_field = field[0]
> -                config_field = field[1]
> -            else:
> -                settings_field = config_field = field
> -
> -            if settings_data := getattr(settings, settings_field):
> -                data[config_field] = settings_data
> -
> -        return data
> -
> -    return _loader
> -
> -
> -class FrozenModel(BaseModel):
> -    """A pre-configured :class:`~pydantic.BaseModel`."""
> -
> -    #: Fields are set as read-only and any extra fields are forbidden.
> -    model_config = ConfigDict(frozen=True, extra="forbid")
> -
> -
> -@unique
> -class OS(StrEnum):
> -    r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
> -
> -    #:
> -    linux = auto()
> -    #:
> -    freebsd = auto()
> -    #:
> -    windows = auto()
> -
> -
> -@unique
> -class Compiler(StrEnum):
> -    r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
> -
> -    #:
> -    gcc = auto()
> -    #:
> -    clang = auto()
> -    #:
> -    icc = auto()
> -    #:
> -    msvc = auto()
> -
> -
> -@unique
> -class TrafficGeneratorType(str, Enum):
> -    """The supported traffic generators."""
> -
> -    #:
> -    SCAPY = "SCAPY"
> -
> -
> -class HugepageConfiguration(FrozenModel):
> -    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
> -
> -    #: The number of hugepages to allocate.
> -    number_of: int
> -    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
> -    force_first_numa: bool
> -
> -
> -class PortConfig(FrozenModel):
> -    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
> -
> -    #: The PCI address of the port.
> -    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
> -    #: The driver that the kernel should bind this device to for DPDK to use it.
> -    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
> -    #: The operating system driver name when the operating system controls the port.
> -    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
> -    #: The name of the peer node this port is connected to.
> -    peer_node: str
> -    #: The PCI address of the peer port connected to this port.
> -    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
> -
> -
> -class TrafficGeneratorConfig(FrozenModel):
> -    """A protocol required to define traffic generator types."""
> -
> -    #: The traffic generator type the child class is required to define to be distinguished among
> -    #: others.
> -    type: TrafficGeneratorType
> -
> -
> -class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> -    """Scapy traffic generator specific configuration."""
> -
> -    type: Literal[TrafficGeneratorType.SCAPY]
> -
> -
> -#: A union type discriminating traffic generators by the `type` field.
> -TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
> -
> -#: Comma-separated list of logical cores to use. An empty string or ```any``` means use all lcores.
> -LogicalCores = Annotated[
> -    str,
> -    Field(
> -        examples=["1,2,3,4,5,18-22", "10-15", "any"],
> -        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$|any",
> -    ),
> -]
> -
> -
> -class NodeConfiguration(FrozenModel):
> -    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
> -
> -    #: The name of the :class:`~framework.testbed_model.node.Node`.
> -    name: str
> -    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
> -    hostname: str
> -    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
> -    user: str
> -    #: The password of the user. The use of passwords is heavily discouraged, please use SSH keys.
> -    password: str | None = None
> -    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
> -    os: OS
> -    #: An optional hugepage configuration.
> -    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
> -    #: The ports that can be used in testing.
> -    ports: list[PortConfig] = Field(min_length=1)
> -
> -
> -class DPDKConfiguration(FrozenModel):
> -    """Configuration of the DPDK EAL parameters."""
> -
> -    #: A comma delimited list of logical cores to use when running DPDK. ```any```, an empty
> -    #: string or omitting this field means use any core except for the first one. The first core
> -    #: will only be used if explicitly set.
> -    lcores: LogicalCores = ""
> -
> -    #: The number of memory channels to use when running DPDK.
> -    memory_channels: int = 1
> -
> -    @property
> -    def use_first_core(self) -> bool:
> -        """Returns :data:`True` if `lcores` explicitly selects the first core."""
> -        return "0" in self.lcores
> -
> -
> -class SutNodeConfiguration(NodeConfiguration):
> -    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
> -
> -    #: The runtime configuration for DPDK.
> -    dpdk_config: DPDKConfiguration
> -
> -
> -class TGNodeConfiguration(NodeConfiguration):
> -    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
> -
> -    #: The configuration of the traffic generator present on the TG node.
> -    traffic_generator: TrafficGeneratorConfigTypes
> -
> -
> -#: Union type for all the node configuration types.
> -NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
> -
> -
> -def resolve_path(path: Path) -> Path:
> -    """Resolve a path into a real path."""
> -    return path.resolve()
> -
> -
> -class BaseDPDKLocation(FrozenModel):
> -    """DPDK location base class.
> -
> -    The path to the DPDK sources and type of location.
> -    """
> -
> -    #: Specifies whether to find DPDK on the SUT node or on the local host. Which are respectively
> -    #: represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
> -    remote: bool = False
> -
> -
> -class LocalDPDKLocation(BaseDPDKLocation):
> -    """Local DPDK location base class.
> -
> -    This class is meant to represent any location that is present only locally.
> -    """
> -
> -    remote: Literal[False] = False
> -
> -
> -class LocalDPDKTreeLocation(LocalDPDKLocation):
> -    """Local DPDK tree location.
>
> -    This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
> -    validation.
> -    """
> -
> -    #: The path to the DPDK source tree directory on the local host passed as string.
> -    dpdk_tree: Path
> -
> -    #: Resolve the local DPDK tree path.
> -    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
> -
> -    @model_validator(mode="after")
> -    def validate_dpdk_tree_path(self) -> Self:
> -        """Validate the provided DPDK tree path."""
> -        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
> -        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
> -        return self
> -
> -
> -class LocalDPDKTarballLocation(LocalDPDKLocation):
> -    """Local DPDK tarball location.
> -
> -    This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
> -    validation.
> -    """
> -
> -    #: The path to the DPDK tarball on the local host passed as string.
> -    tarball: Path
> -
> -    #: Resolve the local tarball path.
> -    resolve_tarball_path = field_validator("tarball")(resolve_path)
> -
> -    @model_validator(mode="after")
> -    def validate_tarball_path(self) -> Self:
> -        """Validate the provided tarball."""
> -        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
> -        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
> -        return self
> -
> -
> -class RemoteDPDKLocation(BaseDPDKLocation):
> -    """Remote DPDK location base class.
> -
> -    This class is meant to represent any location that is present only remotely.
> -    """
> -
> -    remote: Literal[True] = True
> -
> -
> -class RemoteDPDKTreeLocation(RemoteDPDKLocation):
> -    """Remote DPDK tree location.
> -
> -    This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
> -    """
> -
> -    #: The path to the DPDK source tree directory on the remote node passed as string.
> -    dpdk_tree: PurePath
> -
> -
> -class RemoteDPDKTarballLocation(RemoteDPDKLocation):
> -    """Remote DPDK tarball location.
> -
> -    This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
> -    validation.
> -    """
> -
> -    #: The path to the DPDK tarball on the remote node passed as string.
> -    tarball: PurePath
> -
> -
> -#: Union type for different DPDK locations.
> -DPDKLocation = (
> -    LocalDPDKTreeLocation
> -    | LocalDPDKTarballLocation
> -    | RemoteDPDKTreeLocation
> -    | RemoteDPDKTarballLocation
> +from .common import FrozenModel, ValidationContext
> +from .node import (
> +    NodeConfiguration,
> +    NodeConfigurationTypes,
> +    SutNodeConfiguration,
> +    TGNodeConfiguration,
>  )
> -
> -
> -class BaseDPDKBuildConfiguration(FrozenModel):
> -    """The base configuration for different types of build.
> -
> -    The configuration contain the location of the DPDK and configuration used for building it.
> -    """
> -
> -    #: The location of the DPDK tree.
> -    dpdk_location: DPDKLocation
> -
> -    dpdk_location_from_settings = model_validator(mode="before")(
> -        load_fields_from_settings("dpdk_location")
> -    )
> -
> -
> -class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration):
> -    """DPDK precompiled build configuration."""
> -
> -    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
> -    #: subdirectory of `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
> -    precompiled_build_dir: str = Field(min_length=1)
> -
> -    build_dir_from_settings = model_validator(mode="before")(
> -        load_fields_from_settings("precompiled_build_dir")
> -    )
> -
> -
> -class DPDKBuildOptionsConfiguration(FrozenModel):
> -    """DPDK build options configuration.
> -
> -    The build options used for building DPDK.
> -    """
> -
> -    #: The compiler executable to use.
> -    compiler: Compiler
> -    #: This string will be put in front of the compiler when executing the build. Useful for adding
> -    #: wrapper commands, such as ``ccache``.
> -    compiler_wrapper: str = ""
> -
> -
> -class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration):
> -    """DPDK uncompiled build configuration."""
> -
> -    #: The build options to compiled DPDK with.
> -    build_options: DPDKBuildOptionsConfiguration
> -
> -
> -#: Union type for different build configurations.
> -DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
> -
> -
> -class TestSuiteConfig(FrozenModel):
> -    """Test suite configuration.
> -
> -    Information about a single test suite to be executed. This can also be represented as a string
> -    instead of a mapping, example:
> -
> -    .. code:: yaml
> -
> -        test_runs:
> -        - test_suites:
> -            # As string representation:
> -            - hello_world # test all of `hello_world`, or
> -            - hello_world hello_world_single_core # test only `hello_world_single_core`
> -            # or as model fields:
> -            - test_suite: hello_world
> -              test_cases: [hello_world_single_core] # without this field all test cases are run
> -    """
> -
> -    #: The name of the test suite module without the starting ``TestSuite_``.
> -    test_suite_name: str = Field(alias="test_suite")
> -    #: The names of test cases from this test suite to execute. If empty, all test cases will be
> -    #: executed.
> -    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
> -
> -    @cached_property
> -    def test_suite_spec(self) -> "TestSuiteSpec":
> -        """The specification of the requested test suite."""
> -        from framework.test_suite import find_by_name
> -
> -        test_suite_spec = find_by_name(self.test_suite_name)
> -        assert (
> -            test_suite_spec is not None
> -        ), f"{self.test_suite_name} is not a valid test suite module name."
> -        return test_suite_spec
> -
> -    @model_validator(mode="before")
> -    @classmethod
> -    def convert_from_string(cls, data: Any) -> Any:
> -        """Convert the string representation of the model into a valid mapping."""
> -        if isinstance(data, str):
> -            [test_suite, *test_cases] = data.split()
> -            return dict(test_suite=test_suite, test_cases=test_cases)
> -        return data
> -
> -    @model_validator(mode="after")
> -    def validate_names(self) -> Self:
> -        """Validate the supplied test suite and test cases names.
> -
> -        This validator relies on the cached property `test_suite_spec` to run for the first
> -        time in this call, therefore triggering the assertions if needed.
> -        """
> -        available_test_cases = map(
> -            lambda t: t.name, self.test_suite_spec.class_obj.get_test_cases()
> -        )
> -        for requested_test_case in self.test_cases_names:
> -            assert requested_test_case in available_test_cases, (
> -                f"{requested_test_case} is not a valid test case "
> -                f"of test suite {self.test_suite_name}."
> -            )
> -
> -        return self
> -
> -
> -class TestRunConfiguration(FrozenModel):
> -    """The configuration of a test run.
> -
> -    The configuration contains testbed information, what tests to execute
> -    and with what DPDK build.
> -    """
> -
> -    #: The DPDK configuration used to test.
> -    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
> -    #: Whether to run performance tests.
> -    perf: bool
> -    #: Whether to run functional tests.
> -    func: bool
> -    #: Whether to skip smoke tests.
> -    skip_smoke_tests: bool = False
> -    #: The names of test suites and/or test cases to execute.
> -    test_suites: list[TestSuiteConfig] = Field(min_length=1)
> -    #: The SUT node name to use in this test run.
> -    system_under_test_node: str
> -    #: The TG node name to use in this test run.
> -    traffic_generator_node: str
> -    #: The names of virtual devices to test.
> -    vdevs: list[str] = Field(default_factory=list)
> -    #: The seed to use for pseudo-random generation.
> -    random_seed: int | None = None
> -
> -    fields_from_settings = model_validator(mode="before")(
> -        load_fields_from_settings("test_suites", "random_seed")
> -    )
> +from .test_run import TestRunConfiguration
>
>
>  class TestRunWithNodesConfiguration(NamedTuple):
> @@ -506,13 +58,18 @@ class TestRunWithNodesConfiguration(NamedTuple):
>      tg_node_config: TGNodeConfiguration
>
>
> +TestRunsConfig = Annotated[list[TestRunConfiguration], Field(min_length=1)]
> +
> +NodesConfig = Annotated[list[NodeConfigurationTypes], Field(min_length=1)]
> +
> +
>  class Configuration(FrozenModel):
>      """DTS testbed and test configuration."""
>
>      #: Test run configurations.
> -    test_runs: list[TestRunConfiguration] = Field(min_length=1)
> +    test_runs: TestRunsConfig
>      #: Node configurations.
> -    nodes: list[NodeConfigurationTypes] = Field(min_length=1)
> +    nodes: NodesConfig
>
>      @cached_property
>      def test_runs_with_nodes(self) -> list[TestRunWithNodesConfiguration]:
> @@ -596,30 +153,37 @@ def validate_test_runs_with_nodes(self) -> Self:
>          return self
>
>
> -def load_config(settings: Settings) -> Configuration:
> -    """Load DTS test run configuration from a file.
> +T = TypeVar("T")
> +
> +
> +def _load_and_parse_model(file_path: Path, model_type: T, ctx: ValidationContext) -> T:
> +    with open(file_path) as f:
> +        try:
> +            data = yaml.safe_load(f)
> +            return TypeAdapter(model_type).validate_python(data, context=cast(dict[str, Any], ctx))
> +        except ValidationError as e:
> +            msg = f"failed to load the configuration file {file_path}"
> +            raise ConfigurationError(msg) from e
> +
>
> -    Load the YAML test run configuration file, validate it, and create a test run configuration
> -    object.
> +def load_config(ctx: ValidationContext) -> Configuration:
> +    """Load the DTS configuration from files.
>
> -    The YAML test run configuration file is specified in the :option:`--config-file` command line
> -    argument or the :envvar:`DTS_CFG_FILE` environment variable.
> +    Load the YAML configuration files, validate them, and create a configuration object.
>
>      Args:
> -        config_file_path: The path to the YAML test run configuration file.
> -        settings: The settings provided by the user on the command line.
> +        ctx: The context required for validation.
>
>      Returns:
>          The parsed test run configuration.
>
>      Raises:
> -        ConfigurationError: If the supplied configuration file is invalid.
> +        ConfigurationError: If the supplied configuration files are invalid.
>      """
> -    with open(settings.config_file_path, "r") as f:
> -        config_data = yaml.safe_load(f)
> +    test_runs = _load_and_parse_model(ctx["settings"].test_runs_config_path, TestRunsConfig, ctx)
> +    nodes = _load_and_parse_model(ctx["settings"].nodes_config_path, NodesConfig, ctx)
>
>      try:
> -        context = ValidationContext(settings=settings)
> -        return Configuration.model_validate(config_data, context=context)
> +        return Configuration.model_validate({"test_runs": test_runs, "nodes": nodes}, context=ctx)
>      except ValidationError as e:
> -        raise ConfigurationError("failed to load the supplied configuration") from e
> +        raise ConfigurationError("the configurations supplied are invalid") from e
> diff --git a/dts/framework/config/common.py b/dts/framework/config/common.py
> new file mode 100644
> index 0000000000..25265cb9da
> --- /dev/null
> +++ b/dts/framework/config/common.py
> @@ -0,0 +1,59 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright (c) 2025 Arm Limited
> +
> +"""Common definitions and objects for the configuration."""
> +
> +from collections.abc import Callable, MutableMapping
> +from typing import TYPE_CHECKING, Any, TypedDict, cast
> +
> +from pydantic import BaseModel, ConfigDict, ValidationInfo
> +
> +if TYPE_CHECKING:
> +    from framework.settings import Settings
> +
> +
> +class ValidationContext(TypedDict):
> +    """A context dictionary to use for validation."""
> +
> +    #: The command line settings.
> +    settings: "Settings"
> +
> +
> +def load_fields_from_settings(
> +    *fields: str | tuple[str, str],
> +) -> Callable[[Any, ValidationInfo], Any]:
> +    """Before model validator that injects values from :attr:`ValidationContext.settings`.
> +
> +    Args:
> +        *fields: The name of the fields to apply the argument value to. If the settings field name
> +            is not the same as the configuration field, supply a tuple with the respective names.
> +
> +    Returns:
> +        Pydantic before model validator.
> +    """
> +
> +    def _loader(data: Any, info: ValidationInfo) -> Any:
> +        if not isinstance(data, MutableMapping):
> +            return data
> +
> +        settings = cast(ValidationContext, info.context)["settings"]
> +        for field in fields:
> +            if isinstance(field, tuple):
> +                settings_field = field[0]
> +                config_field = field[1]
> +            else:
> +                settings_field = config_field = field
> +
> +            if settings_data := getattr(settings, settings_field):
> +                data[config_field] = settings_data
> +
> +        return data
> +
> +    return _loader
> +
> +
> +class FrozenModel(BaseModel):
> +    """A pre-configured :class:`~pydantic.BaseModel`."""
> +
> +    #: Fields are set as read-only and any extra fields are forbidden.
> +    model_config = ConfigDict(frozen=True, extra="forbid")
> diff --git a/dts/framework/config/node.py b/dts/framework/config/node.py
> new file mode 100644
> index 0000000000..a7ace514d9
> --- /dev/null
> +++ b/dts/framework/config/node.py
> @@ -0,0 +1,144 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2010-2021 Intel Corporation
> +# Copyright(c) 2022-2023 University of New Hampshire
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
> +
> +"""Configuration models representing a node.
> +
> +The root model of a node configuration is :class:`NodeConfiguration`.
> +"""
> +
> +from enum import Enum, auto, unique
> +from typing import Annotated, Literal
> +
> +from pydantic import Field
> +
> +from framework.utils import REGEX_FOR_PCI_ADDRESS, StrEnum
> +
> +from .common import FrozenModel
> +
> +
> +@unique
> +class OS(StrEnum):
> +    r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #:
> +    linux = auto()
> +    #:
> +    freebsd = auto()
> +    #:
> +    windows = auto()
> +
> +
> +@unique
> +class TrafficGeneratorType(str, Enum):
> +    """The supported traffic generators."""
> +
> +    #:
> +    SCAPY = "SCAPY"
> +
> +
> +class HugepageConfiguration(FrozenModel):
> +    r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #: The number of hugepages to allocate.
> +    number_of: int
> +    #: If :data:`True`, the hugepages will be configured on the first NUMA node.
> +    force_first_numa: bool
> +
> +
> +class PortConfig(FrozenModel):
> +    r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #: The PCI address of the port.
> +    pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
> +    #: The driver that the kernel should bind this device to for DPDK to use it.
> +    os_driver_for_dpdk: str = Field(examples=["vfio-pci", "mlx5_core"])
> +    #: The operating system driver name when the operating system controls the port.
> +    os_driver: str = Field(examples=["i40e", "ice", "mlx5_core"])
> +    #: The name of the peer node this port is connected to.
> +    peer_node: str
> +    #: The PCI address of the peer port connected to this port.
> +    peer_pci: str = Field(pattern=REGEX_FOR_PCI_ADDRESS)
> +
> +
> +class TrafficGeneratorConfig(FrozenModel):
> +    """A protocol required to define traffic generator types."""
> +
> +    #: The traffic generator type the child class is required to define to be distinguished among
> +    #: others.
> +    type: TrafficGeneratorType
> +
> +
> +class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> +    """Scapy traffic generator specific configuration."""
> +
> +    type: Literal[TrafficGeneratorType.SCAPY]
> +
> +
> +#: A union type discriminating traffic generators by the `type` field.
> +TrafficGeneratorConfigTypes = Annotated[ScapyTrafficGeneratorConfig, Field(discriminator="type")]
> +
> +#: Comma-separated list of logical cores to use. An empty string or ```any``` means use all lcores.
> +LogicalCores = Annotated[
> +    str,
> +    Field(
> +        examples=["1,2,3,4,5,18-22", "10-15", "any"],
> +        pattern=r"^(([0-9]+|([0-9]+-[0-9]+))(,([0-9]+|([0-9]+-[0-9]+)))*)?$|any",
> +    ),
> +]
> +
> +
> +class NodeConfiguration(FrozenModel):
> +    r"""The configuration of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #: The name of the :class:`~framework.testbed_model.node.Node`.
> +    name: str
> +    #: The hostname of the :class:`~framework.testbed_model.node.Node`. Can also be an IP address.
> +    hostname: str
> +    #: The name of the user used to connect to the :class:`~framework.testbed_model.node.Node`.
> +    user: str
> +    #: The password of the user. The use of passwords is heavily discouraged, please use SSH keys.
> +    password: str | None = None
> +    #: The operating system of the :class:`~framework.testbed_model.node.Node`.
> +    os: OS
> +    #: An optional hugepage configuration.
> +    hugepages: HugepageConfiguration | None = Field(None, alias="hugepages_2mb")
> +    #: The ports that can be used in testing.
> +    ports: list[PortConfig] = Field(min_length=1)
> +
> +
> +class DPDKConfiguration(FrozenModel):
> +    """Configuration of the DPDK EAL parameters."""
> +
> +    #: A comma delimited list of logical cores to use when running DPDK. ```any```, an empty
> +    #: string or omitting this field means use any core except for the first one. The first core
> +    #: will only be used if explicitly set.
> +    lcores: LogicalCores = ""
> +
> +    #: The number of memory channels to use when running DPDK.
> +    memory_channels: int = 1
> +
> +    @property
> +    def use_first_core(self) -> bool:
> +        """Returns :data:`True` if `lcores` explicitly selects the first core."""
> +        return "0" in self.lcores
> +
> +
> +class SutNodeConfiguration(NodeConfiguration):
> +    """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration."""
> +
> +    #: The runtime configuration for DPDK.
> +    dpdk_config: DPDKConfiguration
> +
> +
> +class TGNodeConfiguration(NodeConfiguration):
> +    """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration."""
> +
> +    #: The configuration of the traffic generator present on the TG node.
> +    traffic_generator: TrafficGeneratorConfigTypes
> +
> +
> +#: Union type for all the node configuration types.
> +NodeConfigurationTypes = TGNodeConfiguration | SutNodeConfiguration
> diff --git a/dts/framework/config/test_run.py b/dts/framework/config/test_run.py
> new file mode 100644
> index 0000000000..dc0e46047d
> --- /dev/null
> +++ b/dts/framework/config/test_run.py
> @@ -0,0 +1,290 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2010-2021 Intel Corporation
> +# Copyright(c) 2022-2023 University of New Hampshire
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2024 Arm Limited
> +
> +"""Configuration models representing a test run.
> +
> +The root model of a test run configuration is :class:`TestRunConfiguration`.
> +"""
> +
> +import tarfile
> +from enum import auto, unique
> +from functools import cached_property
> +from pathlib import Path, PurePath
> +from typing import Any, Literal
> +
> +from pydantic import Field, field_validator, model_validator
> +from typing_extensions import TYPE_CHECKING, Self
> +
> +from framework.utils import StrEnum
> +
> +from .common import FrozenModel, load_fields_from_settings
> +
> +if TYPE_CHECKING:
> +    from framework.test_suite import TestSuiteSpec
> +
> +
> +@unique
> +class Compiler(StrEnum):
> +    r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
> +
> +    #:
> +    gcc = auto()
> +    #:
> +    clang = auto()
> +    #:
> +    icc = auto()
> +    #:
> +    msvc = auto()
> +
> +
> +def resolve_path(path: Path) -> Path:
> +    """Resolve a path into a real path."""
> +    return path.resolve()
> +
> +
> +class BaseDPDKLocation(FrozenModel):
> +    """DPDK location base class.
> +
> +    The path to the DPDK sources and type of location.
> +    """
> +
> +    #: Specifies whether to find DPDK on the SUT node or on the local host, which are respectively
> +    #: represented by :class:`RemoteDPDKLocation` and :class:`LocalDPDKTreeLocation`.
> +    remote: bool = False
> +
> +
> +class LocalDPDKLocation(BaseDPDKLocation):
> +    """Local DPDK location base class.
> +
> +    This class is meant to represent any location that is present only locally.
> +    """
> +
> +    remote: Literal[False] = False
> +
> +
> +class LocalDPDKTreeLocation(LocalDPDKLocation):
> +    """Local DPDK tree location.
> +
> +    This class makes a distinction from :class:`RemoteDPDKTreeLocation` by enforcing on the fly
> +    validation.
> +    """
> +
> +    #: The path to the DPDK source tree directory on the local host passed as string.
> +    dpdk_tree: Path
> +
> +    #: Resolve the local DPDK tree path.
> +    resolve_dpdk_tree_path = field_validator("dpdk_tree")(resolve_path)
> +
> +    @model_validator(mode="after")
> +    def validate_dpdk_tree_path(self) -> Self:
> +        """Validate the provided DPDK tree path."""
> +        assert self.dpdk_tree.exists(), "DPDK tree not found in local filesystem."
> +        assert self.dpdk_tree.is_dir(), "The DPDK tree path must be a directory."
> +        return self
> +
> +
> +class LocalDPDKTarballLocation(LocalDPDKLocation):
> +    """Local DPDK tarball location.
> +
> +    This class makes a distinction from :class:`RemoteDPDKTarballLocation` by enforcing on the fly
> +    validation.
> +    """
> +
> +    #: The path to the DPDK tarball on the local host passed as string.
> +    tarball: Path
> +
> +    #: Resolve the local tarball path.
> +    resolve_tarball_path = field_validator("tarball")(resolve_path)
> +
> +    @model_validator(mode="after")
> +    def validate_tarball_path(self) -> Self:
> +        """Validate the provided tarball."""
> +        assert self.tarball.exists(), "DPDK tarball not found in local filesystem."
> +        assert tarfile.is_tarfile(self.tarball), "The DPDK tarball must be a valid tar archive."
> +        return self
> +
> +
> +class RemoteDPDKLocation(BaseDPDKLocation):
> +    """Remote DPDK location base class.
> +
> +    This class is meant to represent any location that is present only remotely.
> +    """
> +
> +    remote: Literal[True] = True
> +
> +
> +class RemoteDPDKTreeLocation(RemoteDPDKLocation):
> +    """Remote DPDK tree location.
> +
> +    This class is distinct from :class:`LocalDPDKTreeLocation` which enforces on the fly validation.
> +    """
> +
> +    #: The path to the DPDK source tree directory on the remote node passed as string.
> +    dpdk_tree: PurePath
> +
> +
> +class RemoteDPDKTarballLocation(RemoteDPDKLocation):
> +    """Remote DPDK tarball location.
> +
> +    This class is distinct from :class:`LocalDPDKTarballLocation` which enforces on the fly
> +    validation.
> +    """
> +
> +    #: The path to the DPDK tarball on the remote node passed as string.
> +    tarball: PurePath
> +
> +
> +#: Union type for different DPDK locations.
> +DPDKLocation = (
> +    LocalDPDKTreeLocation
> +    | LocalDPDKTarballLocation
> +    | RemoteDPDKTreeLocation
> +    | RemoteDPDKTarballLocation
> +)
> +
> +
> +class BaseDPDKBuildConfiguration(FrozenModel):
> +    """The base configuration for different types of build.
> +
> +    The configuration contains the location of the DPDK tree and the configuration used for building it.
> +    """
> +
> +    #: The location of the DPDK tree.
> +    dpdk_location: DPDKLocation
> +
> +    dpdk_location_from_settings = model_validator(mode="before")(
> +        load_fields_from_settings("dpdk_location")
> +    )
> +
> +
> +class DPDKPrecompiledBuildConfiguration(BaseDPDKBuildConfiguration):
> +    """DPDK precompiled build configuration."""
> +
> +    #: If it's defined, DPDK has been pre-compiled and the build directory is located in a
> +    #: subdirectory of `~dpdk_location.dpdk_tree` or `~dpdk_location.tarball` root directory.
> +    precompiled_build_dir: str = Field(min_length=1)
> +
> +    build_dir_from_settings = model_validator(mode="before")(
> +        load_fields_from_settings("precompiled_build_dir")
> +    )
> +
> +
> +class DPDKBuildOptionsConfiguration(FrozenModel):
> +    """DPDK build options configuration.
> +
> +    The build options used for building DPDK.
> +    """
> +
> +    #: The compiler executable to use.
> +    compiler: Compiler
> +    #: This string will be put in front of the compiler when executing the build. Useful for adding
> +    #: wrapper commands, such as ``ccache``.
> +    compiler_wrapper: str = ""
> +
> +
> +class DPDKUncompiledBuildConfiguration(BaseDPDKBuildConfiguration):
> +    """DPDK uncompiled build configuration."""
> +
> +    #: The build options to compile DPDK with.
> +    build_options: DPDKBuildOptionsConfiguration
> +
> +
> +#: Union type for different build configurations.
> +DPDKBuildConfiguration = DPDKPrecompiledBuildConfiguration | DPDKUncompiledBuildConfiguration
> +
> +
> +class TestSuiteConfig(FrozenModel):
> +    """Test suite configuration.
> +
> +    Information about a single test suite to be executed. This can also be represented as a string
> +    instead of a mapping, example:
> +
> +    .. code:: yaml
> +
> +        test_runs:
> +        - test_suites:
> +            # As string representation:
> +            - hello_world # test all of `hello_world`, or
> +            - hello_world hello_world_single_core # test only `hello_world_single_core`
> +            # or as model fields:
> +            - test_suite: hello_world
> +              test_cases: [hello_world_single_core] # without this field all test cases are run
> +    """
> +
> +    #: The name of the test suite module without the starting ``TestSuite_``.
> +    test_suite_name: str = Field(alias="test_suite")
> +    #: The names of test cases from this test suite to execute. If empty, all test cases will be
> +    #: executed.
> +    test_cases_names: list[str] = Field(default_factory=list, alias="test_cases")
> +
> +    @cached_property
> +    def test_suite_spec(self) -> "TestSuiteSpec":
> +        """The specification of the requested test suite."""
> +        from framework.test_suite import find_by_name
> +
> +        test_suite_spec = find_by_name(self.test_suite_name)
> +        assert (
> +            test_suite_spec is not None
> +        ), f"{self.test_suite_name} is not a valid test suite module name."
> +        return test_suite_spec
> +
> +    @model_validator(mode="before")
> +    @classmethod
> +    def convert_from_string(cls, data: Any) -> Any:
> +        """Convert the string representation of the model into a valid mapping."""
> +        if isinstance(data, str):
> +            [test_suite, *test_cases] = data.split()
> +            return dict(test_suite=test_suite, test_cases=test_cases)
> +        return data
> +
> +    @model_validator(mode="after")
> +    def validate_names(self) -> Self:
> +        """Validate the supplied test suite and test cases names.
> +
> +        This validator relies on the cached property `test_suite_spec` to run for the first
> +        time in this call, therefore triggering the assertions if needed.
> +        """
> +        available_test_cases = map(
> +            lambda t: t.name, self.test_suite_spec.class_obj.get_test_cases()
> +        )
> +        for requested_test_case in self.test_cases_names:
> +            assert requested_test_case in available_test_cases, (
> +                f"{requested_test_case} is not a valid test case "
> +                f"of test suite {self.test_suite_name}."
> +            )
> +
> +        return self
> +
> +
> +class TestRunConfiguration(FrozenModel):
> +    """The configuration of a test run.
> +
> +    The configuration contains testbed information, what tests to execute
> +    and with what DPDK build.
> +    """
> +
> +    #: The DPDK configuration used to test.
> +    dpdk_config: DPDKBuildConfiguration = Field(alias="dpdk_build")
> +    #: Whether to run performance tests.
> +    perf: bool
> +    #: Whether to run functional tests.
> +    func: bool
> +    #: Whether to skip smoke tests.
> +    skip_smoke_tests: bool = False
> +    #: The names of test suites and/or test cases to execute.
> +    test_suites: list[TestSuiteConfig] = Field(min_length=1)
> +    #: The SUT node name to use in this test run.
> +    system_under_test_node: str
> +    #: The TG node name to use in this test run.
> +    traffic_generator_node: str
> +    #: The names of virtual devices to test.
> +    vdevs: list[str] = Field(default_factory=list)
> +    #: The seed to use for pseudo-random generation.
> +    random_seed: int | None = None
> +
> +    fields_from_settings = model_validator(mode="before")(
> +        load_fields_from_settings("test_suites", "random_seed")
> +    )
> diff --git a/dts/framework/runner.py b/dts/framework/runner.py
> index e46a8c1a4f..9f9789cf49 100644
> --- a/dts/framework/runner.py
> +++ b/dts/framework/runner.py
> @@ -25,17 +25,22 @@
>  from types import MethodType
>  from typing import Iterable
>
> +from framework.config.common import ValidationContext
>  from framework.testbed_model.capability import Capability, get_supported_capabilities
>  from framework.testbed_model.sut_node import SutNode
>  from framework.testbed_model.tg_node import TGNode
>
>  from .config import (
>      Configuration,
> +    load_config,
> +)
> +from .config.node import (
>      SutNodeConfiguration,
> +    TGNodeConfiguration,
> +)
> +from .config.test_run import (
>      TestRunConfiguration,
>      TestSuiteConfig,
> -    TGNodeConfiguration,
> -    load_config,
>  )
>  from .exception import BlockingTestSuiteError, SSHTimeoutError, TestCaseVerifyError
>  from .logger import DTSLogger, DtsStage, get_dts_logger
> @@ -81,7 +86,7 @@ class DTSRunner:
>
>      def __init__(self):
>          """Initialize the instance with configuration, logger, result and string constants."""
> -        self._configuration = load_config(SETTINGS)
> +        self._configuration = load_config(ValidationContext(settings=SETTINGS))
>          self._logger = get_dts_logger()
>          if not os.path.exists(SETTINGS.output_dir):
>              os.makedirs(SETTINGS.output_dir)
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index 873d400bec..cf82a7c18f 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -14,10 +14,15 @@
>
>  The command line arguments along with the supported environment variables are:
>
> -.. option:: --config-file
> -.. envvar:: DTS_CFG_FILE
> +.. option:: --test-runs-config-file
> +.. envvar:: DTS_TEST_RUNS_CFG_FILE
>
> -    The path to the YAML test run configuration file.
> +    The path to the YAML configuration file of the test runs.
> +
> +.. option:: --nodes-config-file
> +.. envvar:: DTS_NODES_CFG_FILE
> +
> +    The path to the YAML configuration file of the nodes.
>
>  .. option:: --output-dir, --output
>  .. envvar:: DTS_OUTPUT_DIR
> @@ -102,7 +107,7 @@
>
>  from pydantic import ValidationError
>
> -from .config import (
> +from .config.test_run import (
>      DPDKLocation,
>      LocalDPDKTarballLocation,
>      LocalDPDKTreeLocation,
> @@ -120,7 +125,9 @@ class Settings:
>      """
>
>      #:
> -    config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> +    test_runs_config_path: Path = Path(__file__).parent.parent.joinpath("test_runs.yaml")
> +    #:
> +    nodes_config_path: Path = Path(__file__).parent.parent.joinpath("nodes.yaml")
>      #:
>      output_dir: str = "output"
>      #:
> @@ -316,14 +323,24 @@ def _get_parser() -> _DTSArgumentParser:
>      )
>
>      action = parser.add_argument(
> -        "--config-file",
> -        default=SETTINGS.config_file_path,
> +        "--test-runs-config-file",
> +        default=SETTINGS.test_runs_config_path,
> +        type=Path,
> +        help="The configuration file that describes the test cases and DPDK build options.",
> +        metavar="FILE_PATH",
> +        dest="test_runs_config_path",
> +    )
> +    _add_env_var_to_action(action, "TEST_RUNS_CFG_FILE")
> +
> +    action = parser.add_argument(
> +        "--nodes-config-file",
> +        default=SETTINGS.nodes_config_path,
>          type=Path,
> -        help="The configuration file that describes the test cases, SUTs and DPDK build configs.",
> +        help="The configuration file that describes the SUT and TG nodes.",
>          metavar="FILE_PATH",
> -        dest="config_file_path",
> +        dest="nodes_config_path",
>      )
> -    _add_env_var_to_action(action, "CFG_FILE")
> +    _add_env_var_to_action(action, "NODES_CFG_FILE")
>
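
For completeness, the option/env-var pairing these flags advertise can be
pictured with plain argparse. This is only a rough, self-contained sketch of
the behaviour; the defaults and structure are my own, not DTS's
_add_env_var_to_action helper:

    import argparse
    import os
    from pathlib import Path

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--test-runs-config-file",
        type=Path,
        default=Path(os.environ.get("DTS_TEST_RUNS_CFG_FILE", "test_runs.yaml")),
        metavar="FILE_PATH",
        dest="test_runs_config_path",
    )
    parser.add_argument(
        "--nodes-config-file",
        type=Path,
        default=Path(os.environ.get("DTS_NODES_CFG_FILE", "nodes.yaml")),
        metavar="FILE_PATH",
        dest="nodes_config_path",
    )

    args = parser.parse_args([])  # an explicit flag, when given, wins over the env var
    print(args.test_runs_config_path, args.nodes_config_path)
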
>      action = parser.add_argument(
>          "--output-dir",
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index 0060155ef9..bffbc52505 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -32,7 +32,7 @@
>
>  from framework.testbed_model.capability import Capability
>
> -from .config import TestRunConfiguration, TestSuiteConfig
> +from .config.test_run import TestRunConfiguration, TestSuiteConfig
>  from .exception import DTSError, ErrorSeverity
>  from .logger import DTSLogger
>  from .test_suite import TestCase, TestSuite
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index 6c2dfd6185..e53a321499 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -15,10 +15,12 @@
>
>  from abc import ABC
>
> -from framework.config import (
> +from framework.config.node import (
>      OS,
> -    DPDKBuildConfiguration,
>      NodeConfiguration,
> +)
> +from framework.config.test_run import (
> +    DPDKBuildConfiguration,
>      TestRunConfiguration,
>  )
>  from framework.exception import ConfigurationError
> diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
> index e436886692..6d5fce40ff 100644
> --- a/dts/framework/testbed_model/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -28,7 +28,7 @@
>  from dataclasses import dataclass
>  from pathlib import Path, PurePath, PurePosixPath
>
> -from framework.config import NodeConfiguration
> +from framework.config.node import NodeConfiguration
>  from framework.logger import DTSLogger
>  from framework.remote_session import (
>      InteractiveRemoteSession,
> diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
> index 566f4c5b46..7177da3371 100644
> --- a/dts/framework/testbed_model/port.py
> +++ b/dts/framework/testbed_model/port.py
> @@ -10,7 +10,7 @@
>
>  from dataclasses import dataclass
>
> -from framework.config import PortConfig
> +from framework.config.node import PortConfig
>
>
>  @dataclass(slots=True, frozen=True)
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index d8f1f9d452..483733cede 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -16,7 +16,10 @@
>  from dataclasses import dataclass
>  from pathlib import Path, PurePath
>
> -from framework.config import (
> +from framework.config.node import (
> +    SutNodeConfiguration,
> +)
> +from framework.config.test_run import (
>      DPDKBuildConfiguration,
>      DPDKBuildOptionsConfiguration,
>      DPDKPrecompiledBuildConfiguration,
> @@ -25,7 +28,6 @@
>      LocalDPDKTreeLocation,
>      RemoteDPDKTarballLocation,
>      RemoteDPDKTreeLocation,
> -    SutNodeConfiguration,
>      TestRunConfiguration,
>  )
>  from framework.exception import ConfigurationError, RemoteFileNotFoundError
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 3071bbd645..86cd278efb 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -11,7 +11,7 @@
>
>  from scapy.packet import Packet
>
> -from framework.config import TGNodeConfiguration
> +from framework.config.node import TGNodeConfiguration
>  from framework.testbed_model.traffic_generator.capturing_traffic_generator import (
>      PacketFilteringConfig,
>  )
> diff --git a/dts/framework/testbed_model/topology.py b/dts/framework/testbed_model/topology.py
> index 0bad59d2a4..caee9b22ea 100644
> --- a/dts/framework/testbed_model/topology.py
> +++ b/dts/framework/testbed_model/topology.py
> @@ -16,7 +16,7 @@
>  else:
>      from aenum import NoAliasEnum
>
> -from framework.config import PortConfig
> +from framework.config.node import PortConfig
>  from framework.exception import ConfigurationError
>
>  from .port import Port
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> index e501f6d5ee..922875f401 100644
> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -14,7 +14,7 @@
>  and a capturing traffic generator is required.
>  """
>
> -from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorConfig
> +from framework.config.node import ScapyTrafficGeneratorConfig, TrafficGeneratorConfig
>  from framework.exception import ConfigurationError
>  from framework.testbed_model.node import Node
>
> diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> index a16cdf6758..c9c7dac54a 100644
> --- a/dts/framework/testbed_model/traffic_generator/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -20,7 +20,7 @@
>  from scapy.layers.l2 import Ether
>  from scapy.packet import Packet
>
> -from framework.config import OS, ScapyTrafficGeneratorConfig
> +from framework.config.node import OS, ScapyTrafficGeneratorConfig
>  from framework.remote_session.python_shell import PythonShell
>  from framework.testbed_model.node import Node
>  from framework.testbed_model.port import Port
> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index a07538cc98..9b4d5dc80a 100644
> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -12,7 +12,7 @@
>
>  from scapy.packet import Packet
>
> -from framework.config import TrafficGeneratorConfig
> +from framework.config.node import TrafficGeneratorConfig
>  from framework.logger import DTSLogger, get_dts_logger
>  from framework.testbed_model.node import Node
>  from framework.testbed_model.port import Port
> diff --git a/dts/nodes.example.yaml b/dts/nodes.example.yaml
> new file mode 100644
> index 0000000000..454d97ab5d
> --- /dev/null
> +++ b/dts/nodes.example.yaml
> @@ -0,0 +1,53 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright 2022-2023 The DPDK contributors
> +# Copyright 2023 Arm Limited
> +
> +# Define a system under test node, having two network ports physically
> +# connected to the corresponding ports in TG 1 (the peer node)
> +- name: "SUT 1"
> +  hostname: sut1.change.me.localhost
> +  user: dtsuser
> +  os: linux
> +  ports:
> +    # sets up the physical link between "SUT 1"@0000:00:08.0 and "TG 1"@0000:00:08.0
> +    - pci: "0000:00:08.0"
> +      os_driver_for_dpdk: vfio-pci # OS driver that DPDK will use
> +      os_driver: i40e              # OS driver to bind when the tests are not running
> +      peer_node: "TG 1"
> +      peer_pci: "0000:00:08.0"
> +    # sets up the physical link between "SUT 1"@0000:00:08.1 and "TG 1"@0000:00:08.1
> +    - pci: "0000:00:08.1"
> +      os_driver_for_dpdk: vfio-pci
> +      os_driver: i40e
> +      peer_node: "TG 1"
> +      peer_pci: "0000:00:08.1"
> +  hugepages_2mb: # optional; if removed, will use system hugepage configuration
> +      number_of: 256
> +      force_first_numa: false
> +  dpdk_config:
> +      lcores: "" # use all available logical cores (skips the first core)
> +      memory_channels: 4 # tells DPDK to use 4 memory channels
> +# Define a Scapy traffic generator node, having two network ports
> +# physically connected to the corresponding ports in SUT 1 (the peer node).
> +- name: "TG 1"
> +  hostname: tg1.change.me.localhost
> +  user: dtsuser
> +  os: linux
> +  ports:
> +    # sets up the physical link between "TG 1"@0000:00:08.0 and "SUT 1"@0000:00:08.0
> +    - pci: "0000:00:08.0"
> +      os_driver_for_dpdk: rdma
> +      os_driver: rdma
> +      peer_node: "SUT 1"
> +      peer_pci: "0000:00:08.0"
> +    # sets up the physical link between "TG 1"@0000:00:08.1 and "SUT 1"@0000:00:08.1
> +    - pci: "0000:00:08.1"
> +      os_driver_for_dpdk: rdma
> +      os_driver: rdma
> +      peer_node: "SUT 1"
> +      peer_pci: "0000:00:08.1"
> +  hugepages_2mb: # optional; if removed, will use system hugepage configuration
> +      number_of: 256
> +      force_first_numa: false
> +  traffic_generator:
> +      type: SCAPY
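
Since the nodes file is now a top-level YAML list, it can be loaded and
sanity-checked on its own. A minimal sketch, assuming PyYAML and pydantic v2
(MinimalNode is a hypothetical stand-in, not the real framework.config.node
models):

    import yaml
    from pydantic import BaseModel, TypeAdapter

    class MinimalNode(BaseModel):  # hypothetical stand-in; extra YAML keys are ignored
        name: str
        hostname: str
        user: str
        os: str

    with open("nodes.example.yaml") as f:
        raw_nodes = yaml.safe_load(f)

    nodes = TypeAdapter(list[MinimalNode]).validate_python(raw_nodes)
    print([node.name for node in nodes])  # e.g. ['SUT 1', 'TG 1']
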
> diff --git a/dts/test_runs.example.yaml b/dts/test_runs.example.yaml
> new file mode 100644
> index 0000000000..5b6afb153e
> --- /dev/null
> +++ b/dts/test_runs.example.yaml
> @@ -0,0 +1,33 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright 2022-2023 The DPDK contributors
> +# Copyright 2023 Arm Limited
> +
> +# Define one test run environment
> +- dpdk_build:
> +    dpdk_location:
> +      # dpdk_tree: Commented out because `tarball` is defined.
> +      tarball: dpdk-tarball.tar.xz
> +      # Either `dpdk_tree` or `tarball` can be defined, but not both.
> +      remote: false # Optional, defaults to false. If it's true, the `dpdk_tree` or `tarball`
> +                    # is located on the SUT node, instead of the execution host.
> +
> +    # precompiled_build_dir: Commented out because `build_options` is defined.
> +    build_options:
> +      # the combination of the following two makes CC="ccache gcc"
> +      compiler: gcc
> +      compiler_wrapper: ccache # Optional.
> +    # If `precompiled_build_dir` is defined, DPDK has been pre-built and the build directory is
> +    # in a subdirectory of the DPDK tree root directory. Otherwise, `build_options` will be used
> +    # to build DPDK from source. Either `precompiled_build_dir` or `build_options` can be
> +    # defined, but not both.
> +  perf: false # disable performance testing
> +  func: true # enable functional testing
> +  skip_smoke_tests: false # optional
> +  test_suites: # the following test suites will be run in their entirety
> +    - hello_world
> +  vdevs: # optional; if removed, vdevs won't be used in the execution
> +    - "crypto_openssl"
> +  # The machine running the DPDK test executable
> +  system_under_test_node: "SUT 1"
> +  # Traffic generator node to use for this execution environment
> +  traffic_generator_node: "TG 1"
> \ No newline at end of file
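
And because test runs only reference nodes by name, the same nodes file can
be paired with any number of test-runs files. As a quick cross-check of that
relationship, here is my own example script (assuming PyYAML), independent of
the framework:

    import yaml

    with open("nodes.example.yaml") as f:
        node_names = {node["name"] for node in yaml.safe_load(f)}

    with open("test_runs.example.yaml") as f:
        test_runs = yaml.safe_load(f)

    for run in test_runs:
        for key in ("system_under_test_node", "traffic_generator_node"):
            assert run[key] in node_names, f"{run[key]} is not defined in the nodes file"

    print(f"{len(test_runs)} test run(s) reference {len(node_names)} node(s)")
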
> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> index ab5ad44850..7ed266dac0 100644
> --- a/dts/tests/TestSuite_smoke_tests.py
> +++ b/dts/tests/TestSuite_smoke_tests.py
> @@ -14,7 +14,7 @@
>
>  import re
>
> -from framework.config import PortConfig
> +from framework.config.node import PortConfig
>  from framework.remote_session.testpmd_shell import TestPmdShell
>  from framework.settings import SETTINGS
>  from framework.test_suite import TestSuite, func_test
> --
> 2.43.0
>

Thread overview: 81+ messages
2024-06-13 20:18 [PATCH 0/4] dts: Remove Excess Attributes From User Config Nicholas Pratte
2024-06-13 20:18 ` [PATCH 1/4] dts: Remove build target config and list of devices Nicholas Pratte
2024-06-14 18:07   ` Jeremy Spewock
2024-06-13 20:18 ` [PATCH 2/4] dts: Use First Core Logic Change Nicholas Pratte
2024-06-14 18:09   ` Jeremy Spewock
2024-06-20 13:41     ` Nicholas Pratte
2024-06-13 20:18 ` [PATCH 3/4] dts: Self-Discovering Architecture Change Nicholas Pratte
2024-06-14 18:09   ` Jeremy Spewock
2024-06-13 20:18 ` [PATCH 4/4] dts: Rework DPDK Attributes In SUT Node Config Nicholas Pratte
2024-06-14 18:11   ` Jeremy Spewock
2024-07-05 17:13 ` [PATCH v2 0/6] dts: Remove Excess Attributes From User Config Nicholas Pratte
2024-07-05 18:29   ` [PATCH v2 1/6] dts: Remove build target config and list of devices Nicholas Pratte
2024-11-06 19:29     ` Dean Marx
2024-07-05 18:31   ` [PATCH v2 2/6] dts: Use First Core Logic Change Nicholas Pratte
2024-11-06 19:48     ` Dean Marx
2024-07-05 18:32   ` [PATCH v2 3/6] dts: Self-Discovering Architecture Change Nicholas Pratte
2024-11-06 20:13     ` Dean Marx
2024-07-05 18:32   ` [PATCH v2 4/6] dts: Rework DPDK Attributes In SUT Node Config Nicholas Pratte
2024-11-06 20:32     ` Dean Marx
2024-07-05 18:33   ` [PATCH v2 5/6] dts: add conditional behavior for test suite Nicholas Pratte
2024-07-05 18:33   ` [PATCH v2 6/6] doc: dpdk documentation changes for new dts config Nicholas Pratte
2025-01-15 14:18   ` [PATCH v3 0/7] dts: refactor configuration Luca Vizzarro
2025-01-15 14:18     ` [PATCH v3 1/7] dts: enable arch self-discovery Luca Vizzarro
2025-01-16 20:52       ` Dean Marx
2025-01-22 17:38       ` Nicholas Pratte
2025-01-15 14:18     ` [PATCH v3 2/7] dts: simplify build options config Luca Vizzarro
2025-01-16 20:53       ` Dean Marx
2025-01-22 17:45       ` Nicholas Pratte
2025-01-15 14:18     ` [PATCH v3 3/7] dts: infer use first core without config Luca Vizzarro
2025-01-16 20:53       ` Dean Marx
2025-01-22 18:02       ` Nicholas Pratte
2025-01-15 14:18     ` [PATCH v3 4/7] dts: rework DPDK attributes in SUT node config Luca Vizzarro
2025-01-16 20:53       ` Dean Marx
2025-01-15 14:18     ` [PATCH v3 5/7] dts: handle CLI overrides in the configuration Luca Vizzarro
2025-01-16 20:53       ` Dean Marx
2025-01-15 14:18     ` [PATCH v3 6/7] dts: split configuration file Luca Vizzarro
2025-01-16 20:54       ` Dean Marx
2025-01-15 14:18     ` [PATCH v3 7/7] dts: run all test suites by default Luca Vizzarro
2025-01-16 21:01       ` Dean Marx
2025-01-20 10:00         ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 1/6] dts: Remove build target config and list of devices Nicholas Pratte
2024-07-16 15:07   ` Jeremy Spewock
2024-09-12 20:33     ` Nicholas Pratte
2024-09-10 11:30   ` Juraj Linkeš
2024-09-12 20:31     ` Nicholas Pratte
2024-11-18 16:51   ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 2/6] dts: Use First Core Logic Change Nicholas Pratte
2024-09-10 13:34   ` Juraj Linkeš
2024-11-18 16:54   ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 3/6] dts: Self-Discovering Architecture Change Nicholas Pratte
2024-09-10 13:41   ` Juraj Linkeš
2024-11-18 17:14   ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 4/6] dts: Rework DPDK Attributes In SUT Node Config Nicholas Pratte
2024-09-10 14:04   ` Juraj Linkeš
2024-11-18 17:16   ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 5/6] dts: add conditional behavior for test suite Nicholas Pratte
2024-07-16 14:59   ` Jeremy Spewock
2024-09-10 14:12   ` Juraj Linkeš
2024-11-06 20:52   ` Dean Marx
2024-11-18 17:21   ` Luca Vizzarro
2024-07-05 17:13 ` [PATCH v2 6/6] doc: dpdk documentation changes for new dts config Nicholas Pratte
2024-09-10 14:17   ` Juraj Linkeš
2024-11-06 20:57   ` Dean Marx
2024-11-18 17:21   ` Luca Vizzarro
2024-07-05 18:24 ` [PATCH v2 1/6] dts: Remove build target config and list of devices Nicholas Pratte
2025-01-24 11:39 ` [PATCH v4 0/7] dts: refactor configuration Luca Vizzarro
2025-01-24 11:39   ` [PATCH v4 1/7] dts: enable arch self-discovery Luca Vizzarro
2025-01-24 18:09     ` Nicholas Pratte
2025-01-24 11:39   ` [PATCH v4 2/7] dts: simplify build options config Luca Vizzarro
2025-01-24 18:10     ` Nicholas Pratte
2025-01-24 11:39   ` [PATCH v4 3/7] dts: infer use first core without config Luca Vizzarro
2025-01-24 18:14     ` Nicholas Pratte
2025-01-24 11:39   ` [PATCH v4 4/7] dts: rework DPDK attributes in SUT node config Luca Vizzarro
2025-01-24 18:14     ` Nicholas Pratte
2025-01-24 11:39   ` [PATCH v4 5/7] dts: handle CLI overrides in the configuration Luca Vizzarro
2025-01-24 18:15     ` Nicholas Pratte
2025-01-24 11:39   ` [PATCH v4 6/7] dts: split configuration file Luca Vizzarro
2025-01-24 18:18     ` Nicholas Pratte [this message]
2025-01-24 11:39   ` [PATCH v4 7/7] dts: run all test suites by default Luca Vizzarro
2025-01-24 18:20     ` Nicholas Pratte
2025-01-24 18:54   ` [PATCH v4 0/7] dts: refactor configuration Nicholas Pratte
