DPDK patches and discussions
* [RFC PATCH v1 00/12] DTS external DPDK build and stats
@ 2024-09-06 13:26 Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 01/12] dts: rename build target to DPDK build Juraj Linkeš
                   ` (11 more replies)
  0 siblings, 12 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Juraj Linkeš

Add support for externally built DPDK. The supported scenarios are:
* DPDK built on remote node
* DPDK built locally
* DPDK not built anywhere, source tree or tarball on remote node
* DPDK not built anywhere, local source tree or tarball

Remove support for multiple build targets per test run. If different
build targets need to be tested, they can be specified as separate
test runs.
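
For illustration, a minimal sketch of how two DPDK builds could be
expressed as two test runs under the new per-run layout (abridged;
required keys such as perf, func, test_suites and the node sections are
omitted, and the second compiler value is only illustrative):

  test_runs:
    - dpdk_build:
        arch: x86_64
        os: linux
        cpu: native
        compiler: gcc
        compiler_wrapper: ccache  # makes CC="ccache gcc"
      # perf, func, test_suites, system_under_test_node, ...
    - dpdk_build:
        arch: x86_64
        os: linux
        cpu: native
        compiler: clang
      # same suites, different compiler

Each test run then builds and tests exactly one DPDK configuration.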

Remove the git-ref option since it's redundant with the new features.

Improve statistics with a JSON output that includes more complete
results.

Tomáš Ďurovec (12):
  dts: rename build target to DPDK build
  dts: one dpdk build per test run
  dts: fix remote session transferring files
  dts: improve path handling for local and remote paths
  dts: add the ability to copy directories via remote
  dts: add ability to prevent overwriting files/dirs
  dts: update argument option for prevent overwriting
  dts: add support for externally compiled DPDK
  doc: update argument options for external DPDK build
  dts: remove git ref option
  doc: remove git-ref argument
  dts: improve statistics

 doc/guides/tools/dts.rst                      |  17 +-
 dts/conf.yaml                                 |   6 +-
 dts/framework/config/__init__.py              | 106 ++++-
 dts/framework/config/conf_yaml_schema.json    |  51 ++-
 dts/framework/config/types.py                 |  19 +-
 dts/framework/exception.py                    |   4 +-
 dts/framework/logger.py                       |   4 -
 dts/framework/remote_session/dpdk_shell.py    |   2 +-
 .../remote_session/remote_session.py          |  18 +-
 dts/framework/remote_session/ssh_session.py   |  12 +-
 dts/framework/runner.py                       | 150 +++----
 dts/framework/settings.py                     | 188 ++++++---
 dts/framework/test_result.py                  | 372 ++++++++++--------
 dts/framework/test_suite.py                   |   2 +-
 dts/framework/testbed_model/node.py           |  22 +-
 dts/framework/testbed_model/os_session.py     | 160 ++++++--
 dts/framework/testbed_model/posix_session.py  | 135 ++++++-
 dts/framework/testbed_model/sut_node.py       | 337 ++++++++++------
 dts/framework/utils.py                        | 168 ++++----
 dts/tests/TestSuite_smoke_tests.py            |   2 +-
 20 files changed, 1110 insertions(+), 665 deletions(-)

-- 
2.43.0



* [RFC PATCH v1 01/12] dts: rename build target to DPDK build
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 02/12] dts: one dpdk build per test run Juraj Linkeš
                   ` (10 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 dts/conf.yaml                              |   2 +-
 dts/framework/config/__init__.py           |  26 ++---
 dts/framework/config/conf_yaml_schema.json |  10 +-
 dts/framework/config/types.py              |   4 +-
 dts/framework/logger.py                    |   4 +-
 dts/framework/runner.py                    | 112 ++++++++++-----------
 dts/framework/settings.py                  |   2 +-
 dts/framework/test_result.py               |  72 +++++++------
 dts/framework/test_suite.py                |   2 +-
 dts/framework/testbed_model/sut_node.py    |  55 +++++-----
 dts/tests/TestSuite_smoke_tests.py         |   2 +-
 11 files changed, 142 insertions(+), 149 deletions(-)

diff --git a/dts/conf.yaml b/dts/conf.yaml
index 7d95016e68..d43e6fcfeb 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -4,7 +4,7 @@
 
 test_runs:
   # define one test run environment
-  - build_targets:
+  - dpdk_builds:
       - arch: x86_64
         os: linux
         cpu: native
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index df60a5030e..598d7101ed 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -45,8 +45,8 @@
 from typing_extensions import Self
 
 from framework.config.types import (
-    BuildTargetConfigDict,
     ConfigurationDict,
+    DPDKBuildConfigDict,
     NodeConfigDict,
     PortConfigDict,
     TestRunConfigDict,
@@ -335,7 +335,7 @@ class NodeInfo:
 
 
 @dataclass(slots=True, frozen=True)
-class BuildTargetConfiguration:
+class DPDKBuildConfiguration:
     """DPDK build configuration.
 
     The configuration used for building DPDK.
@@ -358,7 +358,7 @@ class BuildTargetConfiguration:
     name: str
 
     @classmethod
-    def from_dict(cls, d: BuildTargetConfigDict) -> Self:
+    def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
         r"""A convenience method that processes the inputs before creating an instance.
 
         `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
@@ -368,7 +368,7 @@ def from_dict(cls, d: BuildTargetConfigDict) -> Self:
             d: The configuration dictionary.
 
         Returns:
-            The build target configuration instance.
+            The DPDK build configuration instance.
         """
         return cls(
             arch=Architecture(d["arch"]),
@@ -381,8 +381,8 @@ def from_dict(cls, d: BuildTargetConfigDict) -> Self:
 
 
 @dataclass(slots=True, frozen=True)
-class BuildTargetInfo:
-    """Various versions and other information about a build target.
+class DPDKBuildInfo:
+    """Various versions and other information about a DPDK build.
 
     Attributes:
         dpdk_version: The DPDK version that was built.
@@ -437,7 +437,7 @@ class TestRunConfiguration:
     and with what DPDK build.
 
     Attributes:
-        build_targets: A list of DPDK builds to test.
+        dpdk_builds: A list of DPDK builds to test.
         perf: Whether to run performance tests.
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
@@ -447,7 +447,7 @@ class TestRunConfiguration:
         vdevs: The names of virtual devices to test.
     """
 
-    build_targets: list[BuildTargetConfiguration]
+    dpdk_builds: list[DPDKBuildConfiguration]
     perf: bool
     func: bool
     skip_smoke_tests: bool
@@ -464,7 +464,7 @@ def from_dict(
     ) -> Self:
         """A convenience method that processes the inputs before creating an instance.
 
-        The build target and the test suite config are transformed into their respective objects.
+        The DPDK build and the test suite config are transformed into their respective objects.
         SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
         are just stored.
 
@@ -475,8 +475,8 @@ def from_dict(
         Returns:
             The test run configuration instance.
         """
-        build_targets: list[BuildTargetConfiguration] = list(
-            map(BuildTargetConfiguration.from_dict, d["build_targets"])
+        dpdk_builds: list[DPDKBuildConfiguration] = list(
+            map(DPDKBuildConfiguration.from_dict, d["dpdk_builds"])
         )
         test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
         sut_name = d["system_under_test_node"]["node_name"]
@@ -498,7 +498,7 @@ def from_dict(
             d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
         )
         return cls(
-            build_targets=build_targets,
+            dpdk_builds=dpdk_builds,
             perf=d["perf"],
             func=d["func"],
             skip_smoke_tests=skip_smoke_tests,
@@ -548,7 +548,7 @@ class Configuration:
     def from_dict(cls, d: ConfigurationDict) -> Self:
         """A convenience method that processes the inputs before creating an instance.
 
-        Build target and test suite config are transformed into their respective objects.
+        DPDK build and test suite config are transformed into their respective objects.
         SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
         are just stored.
 
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index f02a310bb5..4b63e9710e 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -110,9 +110,9 @@
         "mscv"
       ]
     },
-    "build_target": {
+    "dpdk_build": {
       "type": "object",
-      "description": "Targets supported by DTS",
+      "description": "DPDK build configuration supported by DTS.",
       "properties": {
         "arch": {
           "type": "string",
@@ -327,10 +327,10 @@
       "items": {
         "type": "object",
         "properties": {
-          "build_targets": {
+          "dpdk_builds": {
             "type": "array",
             "items": {
-              "$ref": "#/definitions/build_target"
+              "$ref": "#/definitions/dpdk_build"
             },
             "minimum": 1
           },
@@ -383,7 +383,7 @@
         },
         "additionalProperties": false,
         "required": [
-          "build_targets",
+          "dpdk_builds",
           "perf",
           "func",
           "test_suites",
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
index cf16556403..703d9eb48e 100644
--- a/dts/framework/config/types.py
+++ b/dts/framework/config/types.py
@@ -71,7 +71,7 @@ class NodeConfigDict(TypedDict):
     traffic_generator: TrafficGeneratorConfigDict
 
 
-class BuildTargetConfigDict(TypedDict):
+class DPDKBuildConfigDict(TypedDict):
     """Allowed keys and values."""
 
     #:
@@ -108,7 +108,7 @@ class TestRunConfigDict(TypedDict):
     """Allowed keys and values."""
 
     #:
-    build_targets: list[BuildTargetConfigDict]
+    dpdk_builds: list[DPDKBuildConfigDict]
     #:
     perf: bool
     #:
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index 9420323d38..3fbe618219 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -33,7 +33,7 @@ class DtsStage(StrEnum):
     #:
     test_run_setup = auto()
     #:
-    build_target_setup = auto()
+    dpdk_build_setup = auto()
     #:
     test_suite_setup = auto()
     #:
@@ -41,7 +41,7 @@ class DtsStage(StrEnum):
     #:
     test_suite_teardown = auto()
     #:
-    build_target_teardown = auto()
+    dpdk_build_teardown = auto()
     #:
     test_run_teardown = auto()
     #:
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 6b6f6a05f5..2b5403e51c 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -8,11 +8,11 @@
 The module is responsible for running DTS in a series of stages:
 
     #. Test run stage,
-    #. Build target stage,
+    #. DPDK build stage,
     #. Test suite stage,
     #. Test case stage.
 
-The test run and build target stages set up the environment before running test suites.
+The test run and DPDK build stages set up the environment before running test suites.
 The test suite stage sets up steps common to all test cases
 and the test case stage runs test cases individually.
 """
@@ -30,8 +30,8 @@
 from framework.testbed_model.tg_node import TGNode
 
 from .config import (
-    BuildTargetConfiguration,
     Configuration,
+    DPDKBuildConfiguration,
     TestRunConfiguration,
     TestSuiteConfig,
     load_config,
@@ -45,7 +45,7 @@
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
-    BuildTargetResult,
+    DPDKBuildResult,
     DTSResult,
     Result,
     TestCaseResult,
@@ -69,9 +69,9 @@ class DTSRunner:
     :class:`~.framework.exception.DTSError`\s.
 
     Example:
-        An error occurs in a build target setup. The current build target is aborted,
+        An error occurs in a DPDK build setup. The current DPDK build is aborted,
         all test suites and their test cases are marked as blocked and the run continues
-        with the next build target. If the errored build target was the last one in the
+        with the next DPDK build. If the errored DPDK build was the last one in the
         given test run, the next test run begins.
     """
 
@@ -97,16 +97,16 @@ def __init__(self):
         self._perf_test_case_regex = r"test_perf_"
 
     def run(self) -> None:
-        """Run all build targets in all test runs from the test run configuration.
+        """Run all DPDK build in all test runs from the test run configuration.
 
-        Before running test suites, test runs and build targets are first set up.
-        The test runs and build targets defined in the test run configuration are iterated over.
-        The test runs define which tests to run and where to run them and build targets define
+        Before running test suites, test runs and DPDK builds are first set up.
+        The test runs and DPDK builds defined in the test run configuration are iterated over.
+        The test runs define which tests to run and where to run them and DPDK builds define
         the DPDK build setup.
 
-        The tests suites are set up for each test run/build target tuple and each discovered
+        The test suites are set up for each test run/DPDK build tuple and each discovered
         test case within the test suite is set up, executed and torn down. After all test cases
-        have been executed, the test suite is torn down and the next build target will be tested.
+        have been executed, the test suite is torn down and the next DPDK build will be tested.
 
         In order to properly mark test suites and test cases as blocked in case of a failure,
         we need to have discovered which test suites and test cases to run before any failures
@@ -116,7 +116,7 @@ def run(self) -> None:
 
             #. Test run setup
 
-                #. Build target setup
+                #. DPDK build setup
 
                     #. Test suite setup
 
@@ -126,7 +126,7 @@ def run(self) -> None:
 
                     #. Test suite teardown
 
-                #. Build target teardown
+                #. DPDK build teardown
 
             #. Test run teardown
 
@@ -414,7 +414,7 @@ def _run_test_run(
     ) -> None:
         """Run the given test run.
 
-        This involves running the test run setup as well as running all build targets
+        This involves running the test run setup as well as running all DPDK builds
         in the given test run. After that, the test run teardown is run.
 
         Args:
@@ -437,13 +437,13 @@ def _run_test_run(
             test_run_result.update_setup(Result.FAIL, e)
 
         else:
-            for build_target_config in test_run_config.build_targets:
-                build_target_result = test_run_result.add_build_target(build_target_config)
-                self._run_build_target(
+            for dpdk_build_config in test_run_config.dpdk_builds:
+                dpdk_build_result = test_run_result.add_dpdk_build(dpdk_build_config)
+                self._run_dpdk_build(
                     sut_node,
                     tg_node,
-                    build_target_config,
-                    build_target_result,
+                    dpdk_build_config,
+                    dpdk_build_result,
                     test_suites_with_cases,
                 )
 
@@ -457,88 +457,87 @@ def _run_test_run(
                 self._logger.exception("Test run teardown failed.")
                 test_run_result.update_teardown(Result.FAIL, e)
 
-    def _run_build_target(
+    def _run_dpdk_build(
         self,
         sut_node: SutNode,
         tg_node: TGNode,
-        build_target_config: BuildTargetConfiguration,
-        build_target_result: BuildTargetResult,
+        dpdk_build_config: DPDKBuildConfiguration,
+        dpdk_build_result: DPDKBuildResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
     ) -> None:
-        """Run the given build target.
+        """Run the given DPDK build.
 
-        This involves running the build target setup as well as running all test suites
-        of the build target's test run.
-        After that, build target teardown is run.
+        This involves running the DPDK build setup as well as running all test suites
+        of the DPDK build's test run.
+        After that, DPDK build teardown is run.
 
         Args:
             sut_node: The test run's sut node.
             tg_node: The test run's tg node.
-            build_target_config: A build target's test run configuration.
-            build_target_result: The build target level result object associated
-                with the current build target.
+            dpdk_build_config: A DPDK build's test run configuration.
+            dpdk_build_result: The DPDK build level result object associated
+                with the current DPDK build.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        self._logger.set_stage(DtsStage.build_target_setup)
-        self._logger.info(f"Running build target '{build_target_config.name}'.")
+        self._logger.set_stage(DtsStage.dpdk_build_setup)
+        self._logger.info(f"Running DPDK build '{dpdk_build_config.name}'.")
 
         try:
-            sut_node.set_up_build_target(build_target_config)
+            sut_node.set_up_dpdk(dpdk_build_config)
             self._result.dpdk_version = sut_node.dpdk_version
-            build_target_result.add_build_target_info(sut_node.get_build_target_info())
-            build_target_result.update_setup(Result.PASS)
+            dpdk_build_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
+            dpdk_build_result.update_setup(Result.PASS)
         except Exception as e:
-            self._logger.exception("Build target setup failed.")
-            build_target_result.update_setup(Result.FAIL, e)
+            self._logger.exception("DPDK build setup failed.")
+            dpdk_build_result.update_setup(Result.FAIL, e)
 
         else:
-            self._run_test_suites(sut_node, tg_node, build_target_result, test_suites_with_cases)
+            self._run_test_suites(sut_node, tg_node, dpdk_build_result, test_suites_with_cases)
 
         finally:
             try:
-                self._logger.set_stage(DtsStage.build_target_teardown)
-                sut_node.tear_down_build_target()
-                build_target_result.update_teardown(Result.PASS)
+                self._logger.set_stage(DtsStage.dpdk_build_teardown)
+                sut_node.tear_down_dpdk()
+                dpdk_build_result.update_teardown(Result.PASS)
             except Exception as e:
-                self._logger.exception("Build target teardown failed.")
-                build_target_result.update_teardown(Result.FAIL, e)
+                self._logger.exception("DPDK build teardown failed.")
+                dpdk_build_result.update_teardown(Result.FAIL, e)
 
     def _run_test_suites(
         self,
         sut_node: SutNode,
         tg_node: TGNode,
-        build_target_result: BuildTargetResult,
+        dpdk_build_result: DPDKBuildResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
     ) -> None:
-        """Run `test_suites_with_cases` with the current build target.
+        """Run `test_suites_with_cases` with the current DPDK build.
 
-        The method assumes the build target we're testing has already been built on the SUT node.
-        The current build target thus corresponds to the current DPDK build present on the SUT node.
+        The method assumes the DPDK we're testing has already been built on the SUT node.
 
         If a blocking test suite (such as the smoke test suite) fails, the rest of the test suites
-        in the current build target won't be executed.
+        in the current DPDK build won't be executed.
 
         Args:
             sut_node: The test run's SUT node.
             tg_node: The test run's TG node.
-            build_target_result: The build target level result object associated
-                with the current build target.
+            dpdk_build_result: The DPDK build level result object associated
+                with the current DPDK build.
             test_suites_with_cases: The test suites with test cases to run.
         """
-        end_build_target = False
+        end_dpdk_build = False
         for test_suite_with_cases in test_suites_with_cases:
-            test_suite_result = build_target_result.add_test_suite(test_suite_with_cases)
+            test_suite_result = dpdk_build_result.add_test_suite(test_suite_with_cases)
             try:
                 self._run_test_suite(sut_node, tg_node, test_suite_result, test_suite_with_cases)
             except BlockingTestSuiteError as e:
                 self._logger.exception(
                     f"An error occurred within {test_suite_with_cases.test_suite_class.__name__}. "
-                    "Skipping build target..."
+                    "Skipping DPDK build ..."
                 )
                 self._result.add_error(e)
-                end_build_target = True
+                end_dpdk_build = True
             # if a blocking test failed and we need to bail out of suite executions
-            if end_build_target:
+            if end_dpdk_build:
                 break
 
     def _run_test_suite(
@@ -550,8 +549,7 @@ def _run_test_suite(
     ) -> None:
         """Set up, execute and tear down `test_suite_with_cases`.
 
-        The method assumes the build target we're testing has already been built on the SUT node.
-        The current build target thus corresponds to the current DPDK build present on the SUT node.
+        The method assumes the DPDK we're testing has already been built on the SUT node.
 
         Test suite execution consists of running the discovered test cases.
         A test case run consists of setup, execution and teardown of said test case.
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index f6303066d4..2b8c583853 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -270,7 +270,7 @@ def _get_parser() -> _DTSArgumentParser:
         "--config-file",
         default=SETTINGS.config_file_path,
         type=Path,
-        help="The configuration file that describes the test cases, SUTs and targets.",
+        help="The configuration file that describes the test cases, SUTs and DPDK build configs.",
         metavar="FILE_PATH",
         dest="config_file_path",
     )
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 5694a2482b..95788b7d2e 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -8,12 +8,12 @@
 
     * :class:`DTSResult` contains
     * :class:`TestRunResult` contains
-    * :class:`BuildTargetResult` contains
+    * :class:`DPDKBuildResult` contains
     * :class:`TestSuiteResult` contains
     * :class:`TestCaseResult`
 
 Each result may contain multiple lower level results, e.g. there are multiple
-:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+:class:`TestSuiteResult`\s in a :class:`DPDKBuildResult`.
 The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
 which also defines some common behaviors in its methods.
 
@@ -33,10 +33,10 @@
 from .config import (
     OS,
     Architecture,
-    BuildTargetConfiguration,
-    BuildTargetInfo,
     Compiler,
     CPUType,
+    DPDKBuildConfiguration,
+    DPDKBuildInfo,
     NodeInfo,
     TestRunConfiguration,
     TestSuiteConfig,
@@ -138,7 +138,7 @@ class BaseResult:
     Stores the results of the setup and teardown portions of the corresponding stage.
     The hierarchical nature of DTS results is captured recursively in an internal list.
     A stage is each level in this particular hierarchy (pre-run or the top-most level,
-    test run, build target, test suite and test case.)
+    test run, DPDK build, test suite and test case.)
 
     Attributes:
         setup_result: The result of the setup of the particular stage.
@@ -223,7 +223,7 @@ class DTSResult(BaseResult):
     """Stores environment information and test results from a DTS run.
 
         * Test run level information, such as testbed and the test suite list,
-        * Build target level information, such as compiler, target OS and cpu,
+        * DPDK build level information, such as compiler, target OS and cpu,
         * Test suite and test case results,
         * All errors that are caught and recorded during DTS execution.
 
@@ -317,7 +317,7 @@ def get_return_code(self) -> int:
 class TestRunResult(BaseResult):
     """The test run specific result.
 
-    The internal list stores the results of all build targets in a given test run.
+    The internal list stores the results of all DPDK builds in a given test run.
 
     Attributes:
         sut_os_name: The operating system of the SUT node.
@@ -342,20 +342,18 @@ def __init__(self, test_run_config: TestRunConfiguration):
         self._config = test_run_config
         self._test_suites_with_cases = []
 
-    def add_build_target(
-        self, build_target_config: BuildTargetConfiguration
-    ) -> "BuildTargetResult":
-        """Add and return the child result (build target).
+    def add_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> "DPDKBuildResult":
+        """Add and return the child result (DPDK build).
 
         Args:
-            build_target_config: The build target's test run configuration.
+            dpdk_build_config: The DPDK build's test run configuration.
 
         Returns:
-            The build target's result.
+            The DPDK build's result.
         """
-        result = BuildTargetResult(
+        result = DPDKBuildResult(
             self._test_suites_with_cases,
-            build_target_config,
+            dpdk_build_config,
         )
         self.child_results.append(result)
         return result
@@ -394,22 +392,22 @@ def add_sut_info(self, sut_info: NodeInfo) -> None:
 
     def _block_result(self) -> None:
         r"""Mark the result as :attr:`~Result.BLOCK`\ed."""
-        for build_target in self._config.build_targets:
-            child_result = self.add_build_target(build_target)
+        for dpdk_build in self._config.dpdk_builds:
+            child_result = self.add_dpdk_build(dpdk_build)
             child_result.update_setup(Result.BLOCK)
 
 
-class BuildTargetResult(BaseResult):
-    """The build target specific result.
+class DPDKBuildResult(BaseResult):
+    """The DPDK build specific result.
 
-    The internal list stores the results of all test suites in a given build target.
+    The internal list stores the results of all test suites in a given DPDK build.
 
     Attributes:
-        arch: The DPDK build target architecture.
-        os: The DPDK build target operating system.
-        cpu: The DPDK build target CPU.
-        compiler: The DPDK build target compiler.
-        compiler_version: The DPDK build target compiler version.
+        arch: The DPDK build architecture.
+        os: The DPDK build operating system.
+        cpu: The DPDK build CPU.
+        compiler: The DPDK build compiler.
+        compiler_version: The DPDK build compiler version.
         dpdk_version: The built DPDK version.
     """
 
@@ -424,19 +422,19 @@ class BuildTargetResult(BaseResult):
     def __init__(
         self,
         test_suites_with_cases: list[TestSuiteWithCases],
-        build_target_config: BuildTargetConfiguration,
+        dpdk_build_config: DPDKBuildConfiguration,
     ):
-        """Extend the constructor with the build target's config and test suites with cases.
+        """Extend the constructor with the DPDK build's config and test suites with cases.
 
         Args:
-            test_suites_with_cases: The test suites with test cases to be run in this build target.
-            build_target_config: The build target's test run configuration.
+            test_suites_with_cases: The test suites with test cases to be run in this DPDK build.
+            dpdk_build_config: The DPDK build's test run configuration.
         """
         super().__init__()
-        self.arch = build_target_config.arch
-        self.os = build_target_config.os
-        self.cpu = build_target_config.cpu
-        self.compiler = build_target_config.compiler
+        self.arch = dpdk_build_config.arch
+        self.os = dpdk_build_config.os
+        self.cpu = dpdk_build_config.cpu
+        self.compiler = dpdk_build_config.compiler
         self.compiler_version = None
         self.dpdk_version = None
         self._test_suites_with_cases = test_suites_with_cases
@@ -457,8 +455,8 @@ def add_test_suite(
         self.child_results.append(result)
         return result
 
-    def add_build_target_info(self, versions: BuildTargetInfo) -> None:
-        """Add information about the build target gathered at runtime.
+    def add_dpdk_build_info(self, versions: DPDKBuildInfo) -> None:
+        """Add information about the DPDK build gathered at runtime.
 
         Args:
             versions: The additional information.
@@ -484,11 +482,11 @@ class TestSuiteResult(BaseResult):
 
     test_suite_name: str
     _test_suite_with_cases: TestSuiteWithCases
-    _parent_result: BuildTargetResult
+    _parent_result: DPDKBuildResult
     _child_configs: list[str]
 
     def __init__(self, test_suite_with_cases: TestSuiteWithCases):
-        """Extend the constructor with test suite's config and BuildTargetResult.
+        """Extend the constructor with test suite's config and DPDKBuildResult.
 
         Args:
             test_suite_with_cases: The test suite with test cases.
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 694b2eba65..c473b91c5a 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -65,7 +65,7 @@ class TestSuite:
     sut_node: SutNode
     tg_node: TGNode
     #: Whether the test suite is blocking. A failure of a blocking test suite
-    #: will block the execution of all subsequent test suites in the current build target.
+    #: will block the execution of all subsequent test suites in the current DPDK build.
     is_blocking: ClassVar[bool] = False
     _logger: DTSLogger
     _port_links: list[PortLink]
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2855fe0276..6b6fb894ca 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -18,8 +18,8 @@
 from pathlib import PurePath
 
 from framework.config import (
-    BuildTargetConfiguration,
-    BuildTargetInfo,
+    DPDKBuildConfiguration,
+    DPDKBuildInfo,
     NodeInfo,
     SutNodeConfiguration,
     TestRunConfiguration,
@@ -57,7 +57,7 @@ class SutNode(Node):
     virtual_devices: list[VirtualDevice]
     dpdk_prefix_list: list[str]
     dpdk_timestamp: str
-    _build_target_config: BuildTargetConfiguration | None
+    _dpdk_build_config: DPDKBuildConfiguration | None
     _env_vars: dict
     _remote_tmp_dir: PurePath
     __remote_dpdk_dir: PurePath | None
@@ -77,7 +77,7 @@ def __init__(self, node_config: SutNodeConfiguration):
         super().__init__(node_config)
         self.virtual_devices = []
         self.dpdk_prefix_list = []
-        self._build_target_config = None
+        self._dpdk_build_config = None
         self._env_vars = {}
         self._remote_tmp_dir = self.main_session.get_remote_tmp_dir()
         self.__remote_dpdk_dir = None
@@ -115,9 +115,9 @@ def remote_dpdk_build_dir(self) -> PurePath:
         This is the directory where DPDK was built.
         We assume it was built in a subdirectory of the extracted tarball.
         """
-        if self._build_target_config:
+        if self._dpdk_build_config:
             return self.main_session.join_remote_path(
-                self._remote_dpdk_dir, self._build_target_config.name
+                self._remote_dpdk_dir, self._dpdk_build_config.name
             )
         else:
             return self.main_session.join_remote_path(self._remote_dpdk_dir, "build")
@@ -140,13 +140,13 @@ def node_info(self) -> NodeInfo:
     def compiler_version(self) -> str:
         """The node's compiler version."""
         if self._compiler_version is None:
-            if self._build_target_config is not None:
+            if self._dpdk_build_config is not None:
                 self._compiler_version = self.main_session.get_compiler_version(
-                    self._build_target_config.compiler.name
+                    self._dpdk_build_config.compiler.name
                 )
             else:
                 self._logger.warning(
-                    "Failed to get compiler version because _build_target_config is None."
+                    "Failed to get compiler version because _dpdk_build_config is None."
                 )
                 return ""
         return self._compiler_version
@@ -160,15 +160,13 @@ def path_to_devbind_script(self) -> PurePath:
             )
         return self._path_to_devbind_script
 
-    def get_build_target_info(self) -> BuildTargetInfo:
-        """Get additional build target information.
+    def get_dpdk_build_info(self) -> DPDKBuildInfo:
+        """Get additional DPDK build information.
 
         Returns:
-            The build target information,
+            The DPDK build information,
         """
-        return BuildTargetInfo(
-            dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
-        )
+        return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
     def _guess_dpdk_remote_dir(self) -> PurePath:
         return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
@@ -189,40 +187,39 @@ def tear_down_test_run(self) -> None:
         super().tear_down_test_run()
         self.virtual_devices = []
 
-    def set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
+    def set_up_dpdk(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
         """Set up DPDK the SUT node and bind ports.
 
         DPDK setup includes setting all internals needed for the build, the copying of DPDK tarball
         and then building DPDK. The drivers are bound to those that DPDK needs.
 
         Args:
-            build_target_config: The build target test run configuration according to which
+            dpdk_build_config: The DPDK build test run configuration according to which
                 the setup steps will be taken.
         """
-        self._configure_build_target(build_target_config)
+        self._configure_dpdk_build(dpdk_build_config)
         self._copy_dpdk_tarball()
         self._build_dpdk()
         self.bind_ports_to_driver()
 
-    def tear_down_build_target(self) -> None:
+    def tear_down_dpdk(self) -> None:
         """Reset DPDK variables and bind port driver to the OS driver."""
         self._env_vars = {}
-        self._build_target_config = None
+        self._dpdk_build_config = None
         self.__remote_dpdk_dir = None
         self._dpdk_version = None
         self._compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
-    def _configure_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
-        """Populate common environment variables and set build target config."""
+    def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+        """Populate common environment variables and set DPDK build config."""
         self._env_vars = {}
-        self._build_target_config = build_target_config
-        self._env_vars.update(self.main_session.get_dpdk_build_env_vars(build_target_config.arch))
-        self._env_vars["CC"] = build_target_config.compiler.name
-        if build_target_config.compiler_wrapper:
-            self._env_vars["CC"] = (
-                f"'{build_target_config.compiler_wrapper} {build_target_config.compiler.name}'"
-            )  # fmt: skip
+        self._dpdk_build_config = dpdk_build_config
+        self._env_vars.update(self.main_session.get_dpdk_build_env_vars(dpdk_build_config.arch))
+        self._env_vars["CC"] = dpdk_build_config.compiler.name
+        if dpdk_build_config.compiler_wrapper:
+            self._env_vars["CC"] = f"'{self._dpdk_build_config.compiler_wrapper} "
+            f"{self._dpdk_build_config.compiler.name}'"
 
     @Node.skip_setup
     def _copy_dpdk_tarball(self) -> None:
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index c0b0e6bb00..ac661472b9 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -29,7 +29,7 @@ class TestSmokeTests(TestSuite):
 
     Attributes:
         is_blocking: This test suite will block the execution of all other test suites
-            in the build target after it.
+            in the DPDK build after it.
         nics_in_node: The NICs present on the SUT node.
     """
 
-- 
2.43.0



* [RFC PATCH v1 02/12] dts: one dpdk build per test run
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 01/12] dts: rename build target to DPDK build Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 03/12] dts: fix remote session transferring files Juraj Linkeš
                   ` (9 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 dts/conf.yaml                              |  14 +--
 dts/framework/config/__init__.py           |   9 +-
 dts/framework/config/conf_yaml_schema.json |  10 +-
 dts/framework/config/types.py              |   2 +-
 dts/framework/logger.py                    |   4 -
 dts/framework/runner.py                    | 117 +++++---------------
 dts/framework/test_result.py               | 119 ++++++---------------
 dts/framework/test_suite.py                |   2 +-
 dts/framework/testbed_model/sut_node.py    |   6 +-
 dts/tests/TestSuite_smoke_tests.py         |   2 +-
 10 files changed, 80 insertions(+), 205 deletions(-)

diff --git a/dts/conf.yaml b/dts/conf.yaml
index d43e6fcfeb..3d5ee5aee5 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -4,13 +4,13 @@
 
 test_runs:
   # define one test run environment
-  - dpdk_builds:
-      - arch: x86_64
-        os: linux
-        cpu: native
-        # the combination of the following two makes CC="ccache gcc"
-        compiler: gcc
-        compiler_wrapper: ccache
+  - dpdk_build:
+      arch: x86_64
+      os: linux
+      cpu: native
+      # the combination of the following two makes CC="ccache gcc"
+      compiler: gcc
+      compiler_wrapper: ccache
     perf: false # disable performance testing
     func: true # enable functional testing
     skip_smoke_tests: false # optional
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 598d7101ed..aba49143ae 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -437,7 +437,7 @@ class TestRunConfiguration:
     and with what DPDK build.
 
     Attributes:
-        dpdk_builds: A list of DPDK builds to test.
+        dpdk_build: A DPDK build to test.
         perf: Whether to run performance tests.
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
@@ -447,7 +447,7 @@ class TestRunConfiguration:
         vdevs: The names of virtual devices to test.
     """
 
-    dpdk_builds: list[DPDKBuildConfiguration]
+    dpdk_build: DPDKBuildConfiguration
     perf: bool
     func: bool
     skip_smoke_tests: bool
@@ -475,9 +475,6 @@ def from_dict(
         Returns:
             The test run configuration instance.
         """
-        dpdk_builds: list[DPDKBuildConfiguration] = list(
-            map(DPDKBuildConfiguration.from_dict, d["dpdk_builds"])
-        )
         test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
         sut_name = d["system_under_test_node"]["node_name"]
         skip_smoke_tests = d.get("skip_smoke_tests", False)
@@ -498,7 +495,7 @@ def from_dict(
             d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
         )
         return cls(
-            dpdk_builds=dpdk_builds,
+            dpdk_build=DPDKBuildConfiguration.from_dict(d["dpdk_build"]),
             perf=d["perf"],
             func=d["func"],
             skip_smoke_tests=skip_smoke_tests,
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index 4b63e9710e..c0c347199e 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -327,12 +327,8 @@
       "items": {
         "type": "object",
         "properties": {
-          "dpdk_builds": {
-            "type": "array",
-            "items": {
-              "$ref": "#/definitions/dpdk_build"
-            },
-            "minimum": 1
+          "dpdk_build": {
+            "$ref": "#/definitions/dpdk_build"
           },
           "perf": {
             "type": "boolean",
@@ -383,7 +379,7 @@
         },
         "additionalProperties": false,
         "required": [
-          "dpdk_builds",
+          "dpdk_build",
           "perf",
           "func",
           "test_suites",
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
index 703d9eb48e..9b3c997c80 100644
--- a/dts/framework/config/types.py
+++ b/dts/framework/config/types.py
@@ -108,7 +108,7 @@ class TestRunConfigDict(TypedDict):
     """Allowed keys and values."""
 
     #:
-    dpdk_builds: list[DPDKBuildConfigDict]
+    dpdk_build: DPDKBuildConfigDict
     #:
     perf: bool
     #:
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index 3fbe618219..d2b8e37da4 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -33,16 +33,12 @@ class DtsStage(StrEnum):
     #:
     test_run_setup = auto()
     #:
-    dpdk_build_setup = auto()
-    #:
     test_suite_setup = auto()
     #:
     test_suite = auto()
     #:
     test_suite_teardown = auto()
     #:
-    dpdk_build_teardown = auto()
-    #:
     test_run_teardown = auto()
     #:
     post_run = auto()
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index 2b5403e51c..a212ca2470 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -12,7 +12,7 @@
     #. Test suite stage,
     #. Test case stage.
 
-The test run and DPDK build stages set up the environment before running test suites.
+The test run stage sets up the environment before running test suites.
 The test suite stage sets up steps common to all test cases
 and the test case stage runs test cases individually.
 """
@@ -29,13 +29,7 @@
 from framework.testbed_model.sut_node import SutNode
 from framework.testbed_model.tg_node import TGNode
 
-from .config import (
-    Configuration,
-    DPDKBuildConfiguration,
-    TestRunConfiguration,
-    TestSuiteConfig,
-    load_config,
-)
+from .config import Configuration, TestRunConfiguration, TestSuiteConfig, load_config
 from .exception import (
     BlockingTestSuiteError,
     ConfigurationError,
@@ -45,7 +39,6 @@
 from .logger import DTSLogger, DtsStage, get_dts_logger
 from .settings import SETTINGS
 from .test_result import (
-    DPDKBuildResult,
     DTSResult,
     Result,
     TestCaseResult,
@@ -69,9 +62,9 @@ class DTSRunner:
     :class:`~.framework.exception.DTSError`\s.
 
     Example:
-        An error occurs in a DPDK build setup. The current DPDK build is aborted,
-        all test suites and their test cases are marked as blocked and the run continues
-        with the next DPDK build. If the errored DPDK build was the last one in the
+        An error occurs in a test suite setup. The current test suite is aborted,
+        all its test cases are marked as blocked and the run continues
+        with the next test suite. If the errored test suite was the last one in the
         given test run, the next test run begins.
     """
 
@@ -97,16 +90,16 @@ def __init__(self):
         self._perf_test_case_regex = r"test_perf_"
 
     def run(self) -> None:
-        """Run all DPDK build in all test runs from the test run configuration.
+        """Run all test runs from the test run configuration.
 
-        Before running test suites, test runs and DPDK builds are first set up.
-        The test runs and DPDK builds defined in the test run configuration are iterated over.
-        The test runs define which tests to run and where to run them and DPDK builds define
-        the DPDK build setup.
+        Before running test suites, test runs are first set up.
+        The test runs defined in the test run configuration are iterated over.
+        The test runs define which tests to run and where to run them.
 
-        The tests suites are set up for each test run/DPDK build tuple and each discovered
+        The test suites are set up for each test run and each discovered
         test case within the test suite is set up, executed and torn down. After all test cases
-        have been executed, the test suite is torn down and the next DPDK build will be tested.
+        have been executed, the test suite is torn down and the next test suite will be run. Once
+        all test suites have been run, the next test run will be tested.
 
         In order to properly mark test suites and test cases as blocked in case of a failure,
         we need to have discovered which test suites and test cases to run before any failures
@@ -116,17 +109,13 @@ def run(self) -> None:
 
             #. Test run setup
 
-                #. DPDK build setup
-
-                    #. Test suite setup
+                #. Test suite setup
 
-                        #. Test case setup
-                        #. Test case logic
-                        #. Test case teardown
+                    #. Test case setup
+                    #. Test case logic
+                    #. Test case teardown
 
-                    #. Test suite teardown
-
-                #. DPDK build teardown
+                #. Test suite teardown
 
             #. Test run teardown
 
@@ -414,7 +403,7 @@ def _run_test_run(
     ) -> None:
         """Run the given test run.
 
-        This involves running the test run setup as well as running all DPDK builds
+        This involves running the test run setup as well as running all test suites
         in the given test run. After that, the test run teardown is run.
 
         Args:
@@ -430,6 +419,7 @@ def _run_test_run(
         test_run_result.add_sut_info(sut_node.node_info)
         try:
             sut_node.set_up_test_run(test_run_config)
+            test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
             tg_node.set_up_test_run(test_run_config)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
@@ -437,15 +427,7 @@ def _run_test_run(
             test_run_result.update_setup(Result.FAIL, e)
 
         else:
-            for dpdk_build_config in test_run_config.dpdk_builds:
-                dpdk_build_result = test_run_result.add_dpdk_build(dpdk_build_config)
-                self._run_dpdk_build(
-                    sut_node,
-                    tg_node,
-                    dpdk_build_config,
-                    dpdk_build_result,
-                    test_suites_with_cases,
-                )
+            self._run_test_suites(sut_node, tg_node, test_run_result, test_suites_with_cases)
 
         finally:
             try:
@@ -457,82 +439,35 @@ def _run_test_run(
                 self._logger.exception("Test run teardown failed.")
                 test_run_result.update_teardown(Result.FAIL, e)
 
-    def _run_dpdk_build(
-        self,
-        sut_node: SutNode,
-        tg_node: TGNode,
-        dpdk_build_config: DPDKBuildConfiguration,
-        dpdk_build_result: DPDKBuildResult,
-        test_suites_with_cases: Iterable[TestSuiteWithCases],
-    ) -> None:
-        """Run the given DPDK build.
-
-        This involves running the DPDK build setup as well as running all test suites
-        of the DPDK build's test run.
-        After that, DPDK build teardown is run.
-
-        Args:
-            sut_node: The test run's sut node.
-            tg_node: The test run's tg node.
-            dpdk_build_config: A DPDK build's test run configuration.
-            dpdk_build_result: The DPDK build level result object associated
-                with the current DPDK build.
-            test_suites_with_cases: The test suites with test cases to run.
-        """
-        self._logger.set_stage(DtsStage.dpdk_build_setup)
-        self._logger.info(f"Running DPDK build '{dpdk_build_config.name}'.")
-
-        try:
-            sut_node.set_up_dpdk(dpdk_build_config)
-            self._result.dpdk_version = sut_node.dpdk_version
-            dpdk_build_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            dpdk_build_result.update_setup(Result.PASS)
-        except Exception as e:
-            self._logger.exception("DPDK build setup failed.")
-            dpdk_build_result.update_setup(Result.FAIL, e)
-
-        else:
-            self._run_test_suites(sut_node, tg_node, dpdk_build_result, test_suites_with_cases)
-
-        finally:
-            try:
-                self._logger.set_stage(DtsStage.dpdk_build_teardown)
-                sut_node.tear_down_dpdk()
-                dpdk_build_result.update_teardown(Result.PASS)
-            except Exception as e:
-                self._logger.exception("DPDK build teardown failed.")
-                dpdk_build_result.update_teardown(Result.FAIL, e)
-
     def _run_test_suites(
         self,
         sut_node: SutNode,
         tg_node: TGNode,
-        dpdk_build_result: DPDKBuildResult,
+        test_run_result: TestRunResult,
         test_suites_with_cases: Iterable[TestSuiteWithCases],
     ) -> None:
-        """Run `test_suites_with_cases` with the current DPDK build.
+        """Run `test_suites_with_cases` with the current test run.
 
         The method assumes the DPDK we're testing has already been built on the SUT node.
 
         If a blocking test suite (such as the smoke test suite) fails, the rest of the test suites
-        in the current DPDK build won't be executed.
+        in the current test run won't be executed.
 
         Args:
             sut_node: The test run's SUT node.
             tg_node: The test run's TG node.
-            dpdk_build_result: The DPDK build level result object associated
-                with the current DPDK build.
+            test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
         """
         end_dpdk_build = False
         for test_suite_with_cases in test_suites_with_cases:
-            test_suite_result = dpdk_build_result.add_test_suite(test_suite_with_cases)
+            test_suite_result = test_run_result.add_test_suite(test_suite_with_cases)
             try:
                 self._run_test_suite(sut_node, tg_node, test_suite_result, test_suite_with_cases)
             except BlockingTestSuiteError as e:
                 self._logger.exception(
                     f"An error occurred within {test_suite_with_cases.test_suite_class.__name__}. "
-                    "Skipping DPDK build ..."
+                    "Skipping the rest of the test suites in this test run."
                 )
                 self._result.add_error(e)
                 end_dpdk_build = True
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 95788b7d2e..9e9cd94e33 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -8,12 +8,11 @@
 
     * :class:`DTSResult` contains
     * :class:`TestRunResult` contains
-    * :class:`DPDKBuildResult` contains
     * :class:`TestSuiteResult` contains
     * :class:`TestCaseResult`
 
 Each result may contain multiple lower level results, e.g. there are multiple
-:class:`TestSuiteResult`\s in a :class:`DPDKBuildResult`.
+:class:`TestSuiteResult`\s in a :class:`TestRunResult`.
 The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
 which also defines some common behaviors in its methods.
 
@@ -35,7 +34,6 @@
     Architecture,
     Compiler,
     CPUType,
-    DPDKBuildConfiguration,
     DPDKBuildInfo,
     NodeInfo,
     TestRunConfiguration,
@@ -138,7 +136,7 @@ class BaseResult:
     Stores the results of the setup and teardown portions of the corresponding stage.
     The hierarchical nature of DTS results is captured recursively in an internal list.
     A stage is each level in this particular hierarchy (pre-run or the top-most level,
-    test run, DPDK build, test suite and test case.)
+    test run, test suite and test case.)
 
     Attributes:
         setup_result: The result of the setup of the particular stage.
@@ -222,8 +220,8 @@ def add_stats(self, statistics: "Statistics") -> None:
 class DTSResult(BaseResult):
     """Stores environment information and test results from a DTS run.
 
-        * Test run level information, such as testbed and the test suite list,
-        * DPDK build level information, such as compiler, target OS and cpu,
+        * Test run level information, such as testbed, compiler, target OS and cpu, and
+          the test suite list,
         * Test suite and test case results,
         * All errors that are caught and recorded during DTS execution.
 
@@ -317,44 +315,61 @@ def get_return_code(self) -> int:
 class TestRunResult(BaseResult):
     """The test run specific result.
 
-    The internal list stores the results of all DPDK builds in a given test run.
+    The internal list stores the results of all test suites in a given test run.
 
     Attributes:
+        arch: The DPDK build architecture.
+        os: The DPDK build operating system.
+        cpu: The DPDK build CPU.
+        compiler: The DPDK build compiler.
+        compiler_version: The DPDK build compiler version.
+        dpdk_version: The built DPDK version.
         sut_os_name: The operating system of the SUT node.
         sut_os_version: The operating system version of the SUT node.
         sut_kernel_version: The operating system kernel version of the SUT node.
     """
 
+    arch: Architecture
+    os: OS
+    cpu: CPUType
+    compiler: Compiler
+    compiler_version: str | None
+    dpdk_version: str | None
     sut_os_name: str
     sut_os_version: str
     sut_kernel_version: str
     _config: TestRunConfiguration
-    _parent_result: DTSResult
     _test_suites_with_cases: list[TestSuiteWithCases]
 
     def __init__(self, test_run_config: TestRunConfiguration):
-        """Extend the constructor with the test run's config and DTSResult.
+        """Extend the constructor with the test run's config.
 
         Args:
             test_run_config: A test run configuration.
         """
         super().__init__()
+        self.arch = test_run_config.dpdk_build.arch
+        self.os = test_run_config.dpdk_build.os
+        self.cpu = test_run_config.dpdk_build.cpu
+        self.compiler = test_run_config.dpdk_build.compiler
+        self.compiler_version = None
+        self.dpdk_version = None
         self._config = test_run_config
         self._test_suites_with_cases = []
 
-    def add_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> "DPDKBuildResult":
-        """Add and return the child result (DPDK build).
+    def add_test_suite(
+        self,
+        test_suite_with_cases: TestSuiteWithCases,
+    ) -> "TestSuiteResult":
+        """Add and return the child result (test suite).
 
         Args:
-            dpdk_build: The DPDK build's test run configuration.
+            test_suite_with_cases: The test suite with test cases.
 
         Returns:
-            The DPDK build's result.
+            The test suite's result.
         """
-        result = DPDKBuildResult(
-            self._test_suites_with_cases,
-            dpdk_build_config,
-        )
+        result = TestSuiteResult(test_suite_with_cases)
         self.child_results.append(result)
         return result
 
@@ -390,71 +405,6 @@ def add_sut_info(self, sut_info: NodeInfo) -> None:
         self.sut_os_version = sut_info.os_version
         self.sut_kernel_version = sut_info.kernel_version
 
-    def _block_result(self) -> None:
-        r"""Mark the result as :attr:`~Result.BLOCK`\ed."""
-        for dpdk_build in self._config.dpdk_builds:
-            child_result = self.add_dpdk_build(dpdk_build)
-            child_result.update_setup(Result.BLOCK)
-
-
-class DPDKBuildResult(BaseResult):
-    """The DPDK build specific result.
-
-    The internal list stores the results of all test suites in a given DPDK build.
-
-    Attributes:
-        arch: The DPDK DPDK build architecture.
-        os: The DPDK DPDK build operating system.
-        cpu: The DPDK DPDK build CPU.
-        compiler: The DPDK DPDK build compiler.
-        compiler_version: The DPDK DPDK build compiler version.
-        dpdk_version: The built DPDK version.
-    """
-
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
-    compiler_version: str | None
-    dpdk_version: str | None
-    _test_suites_with_cases: list[TestSuiteWithCases]
-
-    def __init__(
-        self,
-        test_suites_with_cases: list[TestSuiteWithCases],
-        dpdk_build_config: DPDKBuildConfiguration,
-    ):
-        """Extend the constructor with the DPDK build's config and test suites with cases.
-
-        Args:
-            test_suites_with_cases: The test suites with test cases to be run in this DPDK build.
-            dpdk_build_config: The DPDK build's test run configuration.
-        """
-        super().__init__()
-        self.arch = dpdk_build_config.arch
-        self.os = dpdk_build_config.os
-        self.cpu = dpdk_build_config.cpu
-        self.compiler = dpdk_build_config.compiler
-        self.compiler_version = None
-        self.dpdk_version = None
-        self._test_suites_with_cases = test_suites_with_cases
-
-    def add_test_suite(
-        self,
-        test_suite_with_cases: TestSuiteWithCases,
-    ) -> "TestSuiteResult":
-        """Add and return the child result (test suite).
-
-        Args:
-            test_suite_with_cases: The test suite with test cases.
-
-        Returns:
-            The test suite's result.
-        """
-        result = TestSuiteResult(test_suite_with_cases)
-        self.child_results.append(result)
-        return result
-
     def add_dpdk_build_info(self, versions: DPDKBuildInfo) -> None:
         """Add information about the DPDK build gathered at runtime.
 
@@ -482,11 +432,10 @@ class TestSuiteResult(BaseResult):
 
     test_suite_name: str
     _test_suite_with_cases: TestSuiteWithCases
-    _parent_result: DPDKBuildResult
     _child_configs: list[str]
 
     def __init__(self, test_suite_with_cases: TestSuiteWithCases):
-        """Extend the constructor with test suite's config and DPDKBuildResult.
+        """Extend the constructor with test suite's config.
 
         Args:
             test_suite_with_cases: The test suite with test cases.
@@ -529,7 +478,7 @@ class TestCaseResult(BaseResult, FixtureResult):
     test_case_name: str
 
     def __init__(self, test_case_name: str):
-        """Extend the constructor with test case's name and TestSuiteResult.
+        """Extend the constructor with test case's name.
 
         Args:
             test_case_name: The test case's name.
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index c473b91c5a..80b9fc456d 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -65,7 +65,7 @@ class TestSuite:
     sut_node: SutNode
     tg_node: TGNode
     #: Whether the test suite is blocking. A failure of a blocking test suite
-    #: will block the execution of all subsequent test suites in the current DPDK build.
+    #: will block the execution of all subsequent test suites in the current test run.
     is_blocking: ClassVar[bool] = False
     _logger: DTSLogger
     _port_links: list[PortLink]
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 6b6fb894ca..9bfb91816e 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -181,13 +181,15 @@ def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
         super().set_up_test_run(test_run_config)
         for vdev in test_run_config.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
+        self._set_up_dpdk(test_run_config.dpdk_build)
 
     def tear_down_test_run(self) -> None:
         """Extend the test run teardown with virtual device teardown."""
         super().tear_down_test_run()
         self.virtual_devices = []
+        self._tear_down_dpdk()
 
-    def set_up_dpdk(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _set_up_dpdk(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
         """Set up DPDK the SUT node and bind ports.
 
         DPDK setup includes setting all internals needed for the build, the copying of DPDK tarball
@@ -202,7 +204,7 @@ def set_up_dpdk(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
         self._build_dpdk()
         self.bind_ports_to_driver()
 
-    def tear_down_dpdk(self) -> None:
+    def _tear_down_dpdk(self) -> None:
         """Reset DPDK variables and bind port driver to the OS driver."""
         self._env_vars = {}
         self._dpdk_build_config = None
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index ac661472b9..99fa8d19c7 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -29,7 +29,7 @@ class TestSmokeTests(TestSuite):
 
     Attributes:
         is_blocking: This test suite will block the execution of all other test suites
-            in the DPDK build after it.
+            in the test run after it.
         nics_in_node: The NICs present on the SUT node.
     """
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 03/12] dts: fix remote session transferring files
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 01/12] dts: rename build target to DPDK build Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 02/12] dts: one dpdk build per test run Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 04/12] dts: improve path handling for local and remote paths Juraj Linkeš
                   ` (8 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Fix the order of the source and destination parameters in the file
transfer methods so that it matches the documented behaviour.

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
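A caller-side sketch of the corrected argument order; the `session` variable
and the paths are illustrative, not part of the patch:

    # remote source file first, local destination directory second
    session.copy_from("/remote/output/results.json", "/local/output")
    # local source file first, remote destination directory second
    session.copy_to("/local/dpdk.tar.xz", "/remote/tmp")
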
 dts/framework/remote_session/remote_session.py | 14 ++++++++------
 dts/framework/remote_session/ssh_session.py    |  8 ++++----
 dts/framework/testbed_model/os_session.py      | 18 ++++++++++--------
 dts/framework/testbed_model/posix_session.py   |  8 ++++----
 4 files changed, 26 insertions(+), 22 deletions(-)

diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 8c580b070f..6ca8593c90 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -199,32 +199,34 @@ def is_alive(self) -> bool:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from the remote Node to the local filesystem.
 
         Copy `source_file` from the remote Node associated with this remote session
-        to `destination_file` on the local filesystem.
+        to `destination_dir` on the local filesystem.
 
         Args:
             source_file: The file on the remote Node.
-            destination_file: A file or directory path on the local filesystem.
+            destination_dir: A dir path on the local filesystem, where the `source_file`
+                will be saved.
         """
 
     @abstractmethod
     def copy_to(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from local filesystem to the remote Node.
 
-        Copy `source_file` from local filesystem to `destination_file` on the remote Node
+        Copy `source_file` from local filesystem to `destination_dir` on the remote Node
         associated with this remote session.
 
         Args:
             source_file: The file on the local filesystem.
-            destination_file: A file or directory path on the remote Node.
+            destination_dir: A dir path on the remote Node, where the `source_file`
+                will be saved.
         """
 
     @abstractmethod
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index 66f8176833..a756bfecef 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -106,18 +106,18 @@ def is_alive(self) -> bool:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
-        self.session.get(str(destination_file), str(source_file))
+        self.session.get(str(source_file), str(destination_dir))
 
     def copy_to(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
-        self.session.put(str(source_file), str(destination_file))
+        self.session.put(str(source_file), str(destination_dir))
 
     def close(self) -> None:
         """Overrides :meth:`~.remote_session.RemoteSession.close`."""
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 79f56b289b..8928a47d6f 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -181,32 +181,34 @@ def join_remote_path(self, *args: str | PurePath) -> PurePath:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from the remote node to the local filesystem.
 
         Copy `source_file` from the remote node associated with this remote
-        session to `destination_file` on the local filesystem.
+        session to `destination_dir` on the local filesystem.
 
         Args:
-            source_file: the file on the remote node.
-            destination_file: a file or directory path on the local filesystem.
+            source_file: The file on the remote node.
+            destination_dir: A dir path on the local filesystem, where the `source_file`
+                will be saved.
         """
 
     @abstractmethod
     def copy_to(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from local filesystem to the remote node.
 
-        Copy `source_file` from local filesystem to `destination_file`
+        Copy `source_file` from local filesystem to `destination_dir`
         on the remote node associated with this remote session.
 
         Args:
-            source_file: the file on the local filesystem.
-            destination_file: a file or directory path on the remote node.
+            source_file: The file on the local filesystem.
+            destination_dir: A dir path on the remote Node, where the `source_file`
+                will be saved.
         """
 
     @abstractmethod
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index d279bb8b53..7f0b1f2036 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -88,18 +88,18 @@ def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_from`."""
-        self.remote_session.copy_from(source_file, destination_file)
+        self.remote_session.copy_from(source_file, destination_dir)
 
     def copy_to(
         self,
         source_file: str | PurePath,
-        destination_file: str | PurePath,
+        destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_to`."""
-        self.remote_session.copy_to(source_file, destination_file)
+        self.remote_session.copy_to(source_file, destination_dir)
 
     def remove_remote_dir(
         self,
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 04/12] dts: improve path handling for local and remote paths
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (2 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 03/12] dts: fix remote session transferring files Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 05/12] dts: add the ability to copy directories via remote Juraj Linkeš
                   ` (7 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Update the remote session to clearly differentiate between local and
remote paths. Local paths now accept OS-aware Path objects, while
remote paths are handled as OS-agnostic PurePath objects.

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
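A short sketch of the distinction using the standard library classes
involved; the file names are illustrative:

    from pathlib import Path, PurePosixPath

    local_file = Path("output/dpdk.tar.xz")   # OS-aware, can query the local filesystem
    remote_dir = PurePosixPath("/tmp/dpdk")   # OS-agnostic, pure path manipulation only
    local_file.exists()                       # filesystem access is only available on Path
    remote_dir / local_file.name              # joining and name handling work on both
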
 dts/framework/remote_session/remote_session.py | 6 +++---
 dts/framework/remote_session/ssh_session.py    | 6 +++---
 dts/framework/testbed_model/os_session.py      | 6 +++---
 dts/framework/testbed_model/posix_session.py   | 6 +++---
 4 files changed, 12 insertions(+), 12 deletions(-)

diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 6ca8593c90..ce311e70b6 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -12,7 +12,7 @@
 
 from abc import ABC, abstractmethod
 from dataclasses import InitVar, dataclass, field
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from framework.config import NodeConfiguration
 from framework.exception import RemoteCommandExecutionError
@@ -199,7 +199,7 @@ def is_alive(self) -> bool:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_dir: str | PurePath,
+        destination_dir: str | Path,
     ) -> None:
         """Copy a file from the remote Node to the local filesystem.
 
@@ -215,7 +215,7 @@ def copy_from(
     @abstractmethod
     def copy_to(
         self,
-        source_file: str | PurePath,
+        source_file: str | Path,
         destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from local filesystem to the remote Node.
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index a756bfecef..88a000912e 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -5,7 +5,7 @@
 
 import socket
 import traceback
-from pathlib import PurePath
+from pathlib import Path, PurePath
 
 from fabric import Connection  # type: ignore[import-untyped]
 from invoke.exceptions import (  # type: ignore[import-untyped]
@@ -106,14 +106,14 @@ def is_alive(self) -> bool:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_dir: str | PurePath,
+        destination_dir: str | Path,
     ) -> None:
         """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
         self.session.get(str(source_file), str(destination_dir))
 
     def copy_to(
         self,
-        source_file: str | PurePath,
+        source_file: str | Path,
         destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 8928a47d6f..d24f44df10 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -25,7 +25,7 @@
 from abc import ABC, abstractmethod
 from collections.abc import Iterable
 from ipaddress import IPv4Interface, IPv6Interface
-from pathlib import PurePath
+from pathlib import Path, PurePath
 from typing import Union
 
 from framework.config import Architecture, NodeConfiguration, NodeInfo
@@ -181,7 +181,7 @@ def join_remote_path(self, *args: str | PurePath) -> PurePath:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_dir: str | PurePath,
+        destination_dir: str | Path,
     ) -> None:
         """Copy a file from the remote node to the local filesystem.
 
@@ -197,7 +197,7 @@ def copy_from(
     @abstractmethod
     def copy_to(
         self,
-        source_file: str | PurePath,
+        source_file: str | Path,
         destination_dir: str | PurePath,
     ) -> None:
         """Copy a file from local filesystem to the remote node.
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 7f0b1f2036..0d8c5f91a6 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -13,7 +13,7 @@
 
 import re
 from collections.abc import Iterable
-from pathlib import PurePath, PurePosixPath
+from pathlib import Path, PurePath, PurePosixPath
 
 from framework.config import Architecture, NodeInfo
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
@@ -88,14 +88,14 @@ def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
     def copy_from(
         self,
         source_file: str | PurePath,
-        destination_dir: str | PurePath,
+        destination_dir: str | Path,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_from`."""
         self.remote_session.copy_from(source_file, destination_dir)
 
     def copy_to(
         self,
-        source_file: str | PurePath,
+        source_file: str | Path,
         destination_dir: str | PurePath,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_to`."""
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 05/12] dts: add the ability to copy directories via remote
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (3 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 04/12] dts: improve path handling for local and remote paths Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 06/12] dts: add ability to prevent overwriting files/dirs Juraj Linkeš
                   ` (6 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
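The directory copy is implemented as a tarball round trip. A simplified
sketch of the copy_dir_to() flow added below; the `session` variable and
the paths are illustrative:

    # 1. archive the local directory next to itself, e.g. build/ -> build.tar.xz
    create_tarball(Path("build"), TarCompressionFormat.xz, arcname="build")
    # 2. transfer the tarball to the remote node and drop the local copy
    session.copy_to(Path("build.tar.xz"), PurePosixPath("/tmp/dpdk"))
    Path("build.tar.xz").unlink()
    # 3. extract on the remote node and remove the remote tarball
    session.extract_remote_tarball(PurePosixPath("/tmp/dpdk/build.tar.xz"))
    session.remove_remote_file(PurePosixPath("/tmp/dpdk/build.tar.xz"))
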
 dts/framework/testbed_model/os_session.py    | 88 +++++++++++++++---
 dts/framework/testbed_model/posix_session.py | 93 ++++++++++++++++---
 dts/framework/utils.py                       | 97 ++++++++++++++++++--
 3 files changed, 246 insertions(+), 32 deletions(-)

diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index d24f44df10..92b1a09d94 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -38,7 +38,7 @@
 )
 from framework.remote_session.remote_session import CommandResult
 from framework.settings import SETTINGS
-from framework.utils import MesonArgs
+from framework.utils import MesonArgs, TarCompressionFormat
 
 from .cpu import LogicalCore
 from .port import Port
@@ -178,11 +178,7 @@ def join_remote_path(self, *args: str | PurePath) -> PurePath:
         """
 
     @abstractmethod
-    def copy_from(
-        self,
-        source_file: str | PurePath,
-        destination_dir: str | Path,
-    ) -> None:
+    def copy_from(self, source_file: str | PurePath, destination_dir: str | Path) -> None:
         """Copy a file from the remote node to the local filesystem.
 
         Copy `source_file` from the remote node associated with this remote
@@ -195,11 +191,7 @@ def copy_from(
         """
 
     @abstractmethod
-    def copy_to(
-        self,
-        source_file: str | Path,
-        destination_dir: str | PurePath,
-    ) -> None:
+    def copy_to(self, source_file: str | Path, destination_dir: str | PurePath) -> None:
         """Copy a file from local filesystem to the remote node.
 
         Copy `source_file` from local filesystem to `destination_dir`
@@ -211,6 +203,57 @@ def copy_to(
                 will be saved.
         """
 
+    @abstractmethod
+    def copy_dir_from(
+        self,
+        source_dir: str | PurePath,
+        destination_dir: str | Path,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
+    ) -> None:
+        """Copy a dir from the remote node to the local filesystem.
+
+        Copy `source_dir` from the remote node associated with this remote session to
+        `destination_dir` on the local filesystem. The new local dir will be created
+        at `destination_dir` path.
+
+        Args:
+            source_dir: The dir on the remote node.
+            destination_dir: A dir path on the local filesystem.
+            compress_format: The compression format to use. Default is no compression.
+            exclude: Files or dirs to exclude before creating the tarball.
+        """
+
+    @abstractmethod
+    def copy_dir_to(
+        self,
+        source_dir: str | Path,
+        destination_dir: str | PurePath,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
+    ) -> None:
+        """Copy a dir from the local filesystem to the remote node.
+
+        Copy `source_dir` from the local filesystem to `destination_dir` on the remote node
+        associated with this remote session. The new remote dir will be created at
+        `destination_dir` path.
+
+        Args:
+            source_dir: The dir on the local filesystem.
+            destination_dir: A dir path on the remote node.
+            compress_format: The compression format to use. Default is no compression.
+            exclude: Files or dirs to exclude before creating the tarball.
+        """
+
+    @abstractmethod
+    def remove_remote_file(self, remote_file_path: str | PurePath, force: bool = True) -> None:
+        """Remove remote file, by default remove forcefully.
+
+        Args:
+            remote_file_path: The path of the file to remove.
+            force: If :data:`True`, ignore all warnings and try to remove at all costs.
+        """
+
     @abstractmethod
     def remove_remote_dir(
         self,
@@ -218,14 +261,31 @@ def remove_remote_dir(
         recursive: bool = True,
         force: bool = True,
     ) -> None:
-        """Remove remote directory, by default remove recursively and forcefully.
+        """Remove remote dir, by default remove recursively and forcefully.
 
         Args:
-            remote_dir_path: The path of the directory to remove.
-            recursive: If :data:`True`, also remove all contents inside the directory.
+            remote_dir_path: The path of the dir to remove.
+            recursive: If :data:`True`, also remove all contents inside the dir.
             force: If :data:`True`, ignore all warnings and try to remove at all costs.
         """
 
+    @abstractmethod
+    def create_remote_tarball(
+        self,
+        remote_dir_path: str | PurePath,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
+    ) -> None:
+        """Create a tarball from dir on the remote node.
+
+        The remote tarball will be saved in the directory of `remote_dir_path`.
+
+        Args:
+            remote_dir_path: The path of dir on the remote node.
+            compress_format: The compression format to use. Default is no compression.
+            exclude: Files or dirs to exclude before creating the tarball.
+        """
+
     @abstractmethod
     def extract_remote_tarball(
         self,
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 0d8c5f91a6..5a6d971d7d 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -18,7 +18,13 @@
 from framework.config import Architecture, NodeInfo
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
 from framework.settings import SETTINGS
-from framework.utils import MesonArgs
+from framework.utils import (
+    MesonArgs,
+    TarCompressionFormat,
+    create_tarball,
+    ensure_list_of_strings,
+    extract_tarball,
+)
 
 from .os_session import OSSession
 
@@ -85,21 +91,57 @@ def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
         """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
         return PurePosixPath(*args)
 
-    def copy_from(
+    def copy_from(self, source_file: str | PurePath, destination_dir: str | Path) -> None:
+        """Overrides :meth:`~.os_session.OSSession.copy_from`."""
+        self.remote_session.copy_from(source_file, destination_dir)
+
+    def copy_to(self, source_file: str | Path, destination_dir: str | PurePath) -> None:
+        """Overrides :meth:`~.os_session.OSSession.copy_to`."""
+        self.remote_session.copy_to(source_file, destination_dir)
+
+    def copy_dir_from(
         self,
-        source_file: str | PurePath,
+        source_dir: str | PurePath,
         destination_dir: str | Path,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
     ) -> None:
-        """Overrides :meth:`~.os_session.OSSession.copy_from`."""
-        self.remote_session.copy_from(source_file, destination_dir)
+        """Overrides :meth:`~.os_session.OSSession.copy_dir_from`."""
+        tarball_name = f"{PurePath(source_dir).name}{compress_format.extension}"
+        remote_tarball_path = self.join_remote_path(PurePath(source_dir).parent, tarball_name)
+        self.create_remote_tarball(source_dir, compress_format, exclude)
+
+        self.copy_from(remote_tarball_path, destination_dir)
+        self.remove_remote_file(remote_tarball_path)
 
-    def copy_to(
+        tarball_path = Path(destination_dir, tarball_name)
+        extract_tarball(tarball_path)
+        tarball_path.unlink()
+
+    def copy_dir_to(
         self,
-        source_file: str | Path,
+        source_dir: str | Path,
         destination_dir: str | PurePath,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
     ) -> None:
-        """Overrides :meth:`~.os_session.OSSession.copy_to`."""
-        self.remote_session.copy_to(source_file, destination_dir)
+        """Overrides :meth:`~.os_session.OSSession.copy_dir_to`."""
+        source_dir_name = Path(source_dir).name
+        tar_name = f"{source_dir_name}{compress_format.extension}"
+        tar_path = Path(Path(source_dir).parent, tar_name)
+
+        create_tarball(source_dir, compress_format, arcname=source_dir_name, exclude=exclude)
+        self.copy_to(tar_path, destination_dir)
+        tar_path.unlink()
+
+        remote_tar_path = self.join_remote_path(destination_dir, tar_name)
+        self.extract_remote_tarball(remote_tar_path)
+        self.remove_remote_file(remote_tar_path)
+
+    def remove_remote_file(self, remote_file_path: str | PurePath, force: bool = True) -> None:
+        """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
+        opts = PosixSession.combine_short_options(f=force)
+        self.send_command(f"rm{opts} {remote_file_path}")
 
     def remove_remote_dir(
         self,
@@ -111,10 +153,37 @@ def remove_remote_dir(
         opts = PosixSession.combine_short_options(r=recursive, f=force)
         self.send_command(f"rm{opts} {remote_dir_path}")
 
-    def extract_remote_tarball(
+    def create_remote_tarball(
         self,
-        remote_tarball_path: str | PurePath,
-        expected_dir: str | PurePath | None = None,
+        remote_dir_path: str | PurePath,
+        compress_format: TarCompressionFormat = TarCompressionFormat.none,
+        exclude: str | list[str] | None = None,
+    ) -> None:
+        """Overrides :meth:`~.os_session.OSSession.create_remote_tarball`."""
+
+        def generate_tar_exclude_args(exclude_patterns):
+            """Generate args to exclude patterns when creating a tarball.
+
+            Args:
+                exclude_patterns: The patterns to exclude from the tarball.
+
+            Returns:
+                The generated string args to exclude the specified patterns.
+            """
+            if exclude_patterns:
+                exclude_patterns = ensure_list_of_strings(exclude_patterns)
+                return "".join([f" --exclude={pattern}" for pattern in exclude_patterns])
+            return ""
+
+        target_tarball_path = f"{remote_dir_path}{compress_format.extension}"
+        self.send_command(
+            f"tar caf {target_tarball_path}{generate_tar_exclude_args(exclude)} "
+            f"-C {PurePath(remote_dir_path).parent} {PurePath(remote_dir_path).name}",
+            60,
+        )
+
+    def extract_remote_tarball(
+        self, remote_tarball_path: str | PurePath, expected_dir: str | PurePath | None = None
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
         self.send_command(
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 6b5d5a805f..5757872fbd 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -15,12 +15,15 @@
 """
 
 import atexit
+import fnmatch
 import json
 import os
 import subprocess
+import tarfile
 from enum import Enum
 from pathlib import Path
 from subprocess import SubprocessError
+from typing import Any
 
 from scapy.packet import Packet  # type: ignore[import-untyped]
 
@@ -140,13 +143,17 @@ def __str__(self) -> str:
         return " ".join(f"{self._default_library} {self._dpdk_args}".split())
 
 
-class _TarCompressionFormat(StrEnum):
+class TarCompressionFormat(StrEnum):
     """Compression formats that tar can use.
 
     Enum names are the shell compression commands
     and Enum values are the associated file extensions.
+
+    The 'none' member represents no compression, only archiving with tar.
+    Its value is set to 'tar' to indicate that the file is an uncompressed tar archive.
     """
 
+    none = "tar"
     gzip = "gz"
     compress = "Z"
     bzip2 = "bz2"
@@ -156,6 +163,16 @@ class _TarCompressionFormat(StrEnum):
     xz = "xz"
     zstd = "zst"
 
+    @property
+    def extension(self):
+        """Return the extension associated with the compression format.
+
+        If the compression format is 'none', the extension will be in the format '.tar'.
+        For other compression formats, the extension will be in the format
+        '.tar.{compression format}'.
+        """
+        return f".{self.value}" if self == self.none else f".{self.none.value}.{self.value}"
+
 
 class DPDKGitTarball:
     """Compressed tarball of DPDK from the repository.
@@ -169,7 +186,7 @@ class DPDKGitTarball:
     """
 
     _git_ref: str
-    _tar_compression_format: _TarCompressionFormat
+    _tar_compression_format: TarCompressionFormat
     _tarball_dir: Path
     _tarball_name: str
     _tarball_path: Path | None
@@ -178,7 +195,7 @@ def __init__(
         self,
         git_ref: str,
         output_dir: str,
-        tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
+        tar_compression_format: TarCompressionFormat = TarCompressionFormat.xz,
     ):
         """Create the tarball during initialization.
 
@@ -198,9 +215,7 @@ def __init__(
 
         self._create_tarball_dir()
 
-        self._tarball_name = (
-            f"dpdk-tarball-{self._git_ref}.tar.{self._tar_compression_format.value}"
-        )
+        self._tarball_name = f"dpdk-tarball-{self._git_ref}{self._tar_compression_format.extension}"
         self._tarball_path = self._check_tarball_path()
         if not self._tarball_path:
             self._create_tarball()
@@ -244,3 +259,73 @@ def _delete_tarball(self) -> None:
     def __fspath__(self) -> str:
         """The os.PathLike protocol implementation."""
         return str(self._tarball_path)
+
+
+def ensure_list_of_strings(value: Any | list[Any]) -> list[str]:
+    """Ensure the input is a list of strings.
+
+    A single value is wrapped in a list and all elements are converted to strings.
+
+    Args:
+        value: A single value or a list of values.
+
+    Returns:
+        A list of strings.
+    """
+    return [str(v) for v in value] if isinstance(value, list) else [str(value)]
+
+
+def create_tarball(
+    source_path: str | Path,
+    compress_format: TarCompressionFormat = TarCompressionFormat.none,
+    arcname: str | None = None,
+    exclude: Any | list[Any] | None = None,
+):
+    """Create a tarball archive from a source dir or file.
+
+    The tarball archive will be saved in the same path as `source_path` parent path.
+
+    Args:
+        source_path: The path to the source dir or file to be included in the tarball.
+        compress_format: The compression format to use. Default is no compression.
+        arcname: The name under which `source_path` will be archived.
+        exclude: Files or dirs to exclude before creating the tarball.
+    """
+
+    def create_filter_function(exclude_patterns: str | list[str] | None):
+        """Create a filter function based on the provided exclude patterns.
+
+        Args:
+            exclude_patterns: The patterns to exclude from the tarball.
+
+        Returns:
+            The filter function that excludes files based on the patterns.
+        """
+        if exclude_patterns:
+            exclude_patterns = ensure_list_of_strings(exclude_patterns)
+
+            def filter_func(tarinfo: tarfile.TarInfo) -> tarfile.TarInfo | None:
+                file_name = os.path.basename(tarinfo.name)
+                if any(fnmatch.fnmatch(file_name, pattern) for pattern in exclude_patterns):
+                    return None
+                return tarinfo
+
+            return filter_func
+        return None
+
+    with tarfile.open(
+        f"{source_path}{compress_format.extension}", f"w:{compress_format.value}"
+    ) as tar:
+        tar.add(source_path, arcname=arcname, filter=create_filter_function(exclude))
+
+
+def extract_tarball(tar_path: str | Path):
+    """Extract the contents of a tarball.
+
+    The tarball will be extracted in the same path as `tar_path` parent path.
+
+    Args:
+        tar_path: The path to the tarball file to extract.
+    """
+    with tarfile.open(tar_path, "r") as tar:
+        tar.extractall(path=Path(tar_path).parent)
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 06/12] dts: add ability to prevent overwriting files/dirs
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (4 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 05/12] dts: add the ability to copy directories via remote Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 07/12] dts: update argument option for prevent overwriting Juraj Linkeš
                   ` (5 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
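A minimal caller-side sketch of the new parameter; `force` defaults to the
value of the new --force setting and the paths are illustrative:

    # default: taken from SETTINGS.force (the new -f/--force option)
    session.copy_to(Path("dpdk.tar.xz"), PurePosixPath("/tmp/dpdk"))
    # force=True: remove an existing file of the same name on the remote node first
    session.copy_to(Path("dpdk.tar.xz"), PurePosixPath("/tmp/dpdk"), force=True)
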
 dts/framework/settings.py                    | 17 ++++++++++
 dts/framework/testbed_model/os_session.py    | 31 +++++++++++++++---
 dts/framework/testbed_model/posix_session.py | 33 +++++++++++++++++---
 3 files changed, 71 insertions(+), 10 deletions(-)

diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 2b8c583853..2f7089a26b 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -55,6 +55,11 @@
     Git revision ID to test. Could be commit, tag, tree ID etc.
     To test local changes, first commit them, then use their commit ID.
 
+.. option:: -f, --force
+.. envvar:: DTS_FORCE
+
+    Specify to remove an already existing dpdk tarball before copying/extracting a new one.
+
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
 
@@ -110,6 +115,8 @@ class Settings:
     #:
     dpdk_tarball_path: Path | str = ""
     #:
+    force: bool = False
+    #:
     compile_timeout: float = 1200
     #:
     test_suites: list[TestSuiteConfig] = field(default_factory=list)
@@ -337,6 +344,16 @@ def _get_parser() -> _DTSArgumentParser:
     )
     _add_env_var_to_action(action)
 
+    action = parser.add_argument(
+        "-f",
+        "--force",
+        action="store_true",
+        default=SETTINGS.force,
+        help="Specify to remove an already existing dpdk tarball before copying/extracting a "
+        "new one.",
+    )
+    _add_env_var_to_action(action)
+
     action = parser.add_argument(
         "--compile-timeout",
         default=SETTINGS.compile_timeout,
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 92b1a09d94..afc9ffb814 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -178,7 +178,9 @@ def join_remote_path(self, *args: str | PurePath) -> PurePath:
         """
 
     @abstractmethod
-    def copy_from(self, source_file: str | PurePath, destination_dir: str | Path) -> None:
+    def copy_from(
+        self, source_file: str | PurePath, destination_dir: str | Path, force: bool = SETTINGS.force
+    ) -> None:
         """Copy a file from the remote node to the local filesystem.
 
         Copy `source_file` from the remote node associated with this remote
@@ -188,10 +190,14 @@ def copy_from(self, source_file: str | PurePath, destination_dir: str | Path) ->
             source_file: The file on the remote node.
             destination_dir: A dir path on the local filesystem, where the `source_file`
                 will be saved.
+            force: If :data:`True`, remove an already existing `source_file` at the
+                `destination_dir` before copying to prevent overwriting data.
         """
 
     @abstractmethod
-    def copy_to(self, source_file: str | Path, destination_dir: str | PurePath) -> None:
+    def copy_to(
+        self, source_file: str | Path, destination_dir: str | PurePath, force: bool = SETTINGS.force
+    ) -> None:
         """Copy a file from local filesystem to the remote node.
 
         Copy `source_file` from local filesystem to `destination_dir`
@@ -201,6 +207,8 @@ def copy_to(self, source_file: str | Path, destination_dir: str | PurePath) -> N
             source_file: The file on the local filesystem.
             destination_dir: A dir path on the remote Node, where the `source_file`
                 will be saved.
+            force: If :data:`True`, remove an already existing `source_file` at the
+                `destination_dir` before copying to prevent overwriting data.
         """
 
     @abstractmethod
@@ -210,6 +218,7 @@ def copy_dir_from(
         destination_dir: str | Path,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Copy a dir from the remote node to the local filesystem.
 
@@ -222,6 +231,8 @@ def copy_dir_from(
             destination_dir: A dir path on the local filesystem.
             compress_format: The compression format to use. Default is no compression.
             exclude: Files or dirs to exclude before creating the tarball.
+            force: If :data:`True`, remove an already existing `source_dir` at the `destination_dir`
+                before copying to prevent overwriting data.
         """
 
     @abstractmethod
@@ -231,18 +242,21 @@ def copy_dir_to(
         destination_dir: str | PurePath,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Copy a dir from the local filesystem to the remote node.
 
         Copy `source_dir` from the local filesystem to `destination_dir` on the remote node
-        associated with this remote session. The new remote dir will be created at
-        `destination_dir` path.
+        associated with this remote session. The new remote dir will be created
+        at `destination_dir` path.
 
         Args:
             source_dir: The dir on the local filesystem.
             destination_dir: A dir path on the remote node.
             compress_format: The compression format to use. Default is no compression.
             exclude: Files or dirs to exclude before creating the tarball.
+            force: If :data:`True`, remove an already existing `source_dir` at the `destination_dir`
+                before copying to prevent overwriting data.
         """
 
     @abstractmethod
@@ -275,6 +289,7 @@ def create_remote_tarball(
         remote_dir_path: str | PurePath,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Create a tarball from dir on the remote node.
 
@@ -284,6 +299,8 @@ def create_remote_tarball(
             remote_dir_path: The path of dir on the remote node.
             compress_format: The compression format to use. Default is no compression.
             exclude: Files or dirs to exclude before creating the tarball.
+            force: If :data:`True`, remove an already existing tarball at the directory of
+                `remote_dir_path` before creating a new one to prevent overwriting data.
         """
 
     @abstractmethod
@@ -291,13 +308,17 @@ def extract_remote_tarball(
         self,
         remote_tarball_path: str | PurePath,
         expected_dir: str | PurePath | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
-        """Extract remote tarball in its remote directory.
+        """Extract remote tarball in its remote dir.
 
         Args:
             remote_tarball_path: The path of the tarball on the remote node.
             expected_dir: If non-empty, check whether `expected_dir` exists after extracting
                 the archive.
+            force: If :data:`True` and `expected_dir` is defined, remove an already
+                existing `expected_dir` at the directory of `remote_dir_path` before
+                extracting to prevent overwriting data.
         """
 
     @abstractmethod
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5a6d971d7d..94aac68e8d 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -91,12 +91,22 @@ def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
         """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
         return PurePosixPath(*args)
 
-    def copy_from(self, source_file: str | PurePath, destination_dir: str | Path) -> None:
+    def copy_from(
+        self, source_file: str | PurePath, destination_dir: str | Path, force: bool = SETTINGS.force
+    ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_from`."""
+        if force:
+            Path(destination_dir, PurePath(source_file).name).unlink(missing_ok=True)
+
         self.remote_session.copy_from(source_file, destination_dir)
 
-    def copy_to(self, source_file: str | Path, destination_dir: str | PurePath) -> None:
+    def copy_to(
+        self, source_file: str | Path, destination_dir: str | PurePath, force: bool = SETTINGS.force
+    ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_to`."""
+        if force:
+            self.remove_remote_file(PurePath(destination_dir, Path(source_file).name))
+
         self.remote_session.copy_to(source_file, destination_dir)
 
     def copy_dir_from(
@@ -105,13 +115,14 @@ def copy_dir_from(
         destination_dir: str | Path,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_dir_from`."""
         tarball_name = f"{PurePath(source_dir).name}{compress_format.extension}"
         remote_tarball_path = self.join_remote_path(PurePath(source_dir).parent, tarball_name)
         self.create_remote_tarball(source_dir, compress_format, exclude)
 
-        self.copy_from(remote_tarball_path, destination_dir)
+        self.copy_from(remote_tarball_path, destination_dir, force)
         self.remove_remote_file(remote_tarball_path)
 
         tarball_path = Path(destination_dir, tarball_name)
@@ -124,6 +135,7 @@ def copy_dir_to(
         destination_dir: str | PurePath,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.copy_dir_to`."""
         source_dir_name = Path(source_dir).name
@@ -131,7 +143,7 @@ def copy_dir_to(
         tar_path = Path(Path(source_dir).parent, tar_name)
 
         create_tarball(source_dir, compress_format, arcname=source_dir_name, exclude=exclude)
-        self.copy_to(tar_path, destination_dir)
+        self.copy_to(tar_path, destination_dir, force)
         tar_path.unlink()
 
         remote_tar_path = self.join_remote_path(destination_dir, tar_name)
@@ -158,6 +170,7 @@ def create_remote_tarball(
         remote_dir_path: str | PurePath,
         compress_format: TarCompressionFormat = TarCompressionFormat.none,
         exclude: str | list[str] | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.create_remote_tarball`."""
 
@@ -176,6 +189,9 @@ def generate_tar_exclude_args(exclude_patterns):
             return ""
 
         target_tarball_path = f"{remote_dir_path}{compress_format.extension}"
+        if force:
+            self.remove_remote_file(target_tarball_path)
+
         self.send_command(
             f"tar caf {target_tarball_path}{generate_tar_exclude_args(exclude)} "
             f"-C {PurePath(remote_dir_path).parent} {PurePath(remote_dir_path).name}",
@@ -183,13 +199,20 @@ def generate_tar_exclude_args(exclude_patterns):
         )
 
     def extract_remote_tarball(
-        self, remote_tarball_path: str | PurePath, expected_dir: str | PurePath | None = None
+        self,
+        remote_tarball_path: str | PurePath,
+        expected_dir: str | PurePath | None = None,
+        force: bool = SETTINGS.force,
     ) -> None:
         """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
+        if force and expected_dir:
+            self.remove_remote_dir(expected_dir)
+
         self.send_command(
             f"tar xfm {remote_tarball_path} -C {PurePosixPath(remote_tarball_path).parent}",
             60,
         )
+
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 07/12] dts: update argument option for prevent overwriting
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (5 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 06/12] dts: add ability to prevent overwriting files/dirs Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 08/12] dts: add support for externally compiled DPDK Juraj Linkeš
                   ` (4 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 doc/guides/tools/dts.rst | 1 +
 1 file changed, 1 insertion(+)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..059776c888 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -241,6 +241,7 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
      --revision ID, --rev ID, --git-ref ID
                            [DTS_DPDK_REVISION_ID] Git revision ID to test. Could be commit, tag, tree ID etc. To test local changes, first
                            commit them, then use their commit ID. (default: None)
+     -f, --force           [DTS_FORCE] Specify to remove an already existing dpdk tarball before copying/extracting a new one. (default: False)
      --compile-timeout SECONDS
                            [DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK. (default: 1200)
      --test-suite TEST_SUITE [TEST_CASES ...]
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 08/12] dts: add support for externally compiled DPDK
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (6 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 07/12] dts: update argument option for prevent overwriting Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 09/12] doc: update argument options for external DPDK build Juraj Linkeš
                   ` (3 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 dts/conf.yaml                                |  14 +-
 dts/framework/config/__init__.py             |  87 ++++-
 dts/framework/config/conf_yaml_schema.json   |  41 ++-
 dts/framework/config/types.py                |  17 +-
 dts/framework/exception.py                   |   4 +-
 dts/framework/remote_session/dpdk_shell.py   |   2 +-
 dts/framework/runner.py                      |  16 +-
 dts/framework/settings.py                    | 160 ++++++++--
 dts/framework/test_result.py                 |  27 +-
 dts/framework/testbed_model/node.py          |  22 +-
 dts/framework/testbed_model/os_session.py    |  43 ++-
 dts/framework/testbed_model/posix_session.py |  23 +-
 dts/framework/testbed_model/sut_node.py      | 314 +++++++++++++------
 13 files changed, 562 insertions(+), 208 deletions(-)

diff --git a/dts/conf.yaml b/dts/conf.yaml
index 3d5ee5aee5..a38aaca7f7 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -5,12 +5,14 @@
 test_runs:
   # define one test run environment
   - dpdk_build:
-      arch: x86_64
-      os: linux
-      cpu: native
-      # the combination of the following two makes CC="ccache gcc"
-      compiler: gcc
-      compiler_wrapper: ccache
+      tarball: "" # define path to DPDK tarball
+      build:
+        arch: x86_64
+        os: linux
+        cpu: native
+        # the combination of the following two makes CC="ccache gcc"
+        compiler: gcc
+        compiler_wrapper: ccache
     perf: false # disable performance testing
     func: true # enable functional testing
     skip_smoke_tests: false # optional
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index aba49143ae..0896f4e495 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -47,6 +47,7 @@
 from framework.config.types import (
     ConfigurationDict,
     DPDKBuildConfigDict,
+    DPDKSetupDict,
     NodeConfigDict,
     PortConfigDict,
     TestRunConfigDict,
@@ -380,6 +381,67 @@ def from_dict(cls, d: DPDKBuildConfigDict) -> Self:
         )
 
 
+@dataclass(slots=True, frozen=True)
+class DPDKLocation:
+    """DPDK location.
+
+    The path to the DPDK sources, build dir and type of location.
+
+    Attributes:
+        dpdk_tree: The path to the DPDK tree.
+        tarball: The path to the DPDK tarball.
+        remote: If :data:`True`, `dpdk_tree` or `tarball` is on the SUT node.
+        build_dir: A directory name, which would be located in the `dpdk tree` or `tarball`.
+    """
+
+    dpdk_tree: str | None
+    tarball: str | None
+    remote: bool
+    build_dir: str | None
+
+    @classmethod
+    def from_dict(cls, d: DPDKSetupDict) -> Self | None:
+        """A convenience method that processes and validate the inputs before creating an instance.
+
+        Ensures that either `dpdk_tree` or `tarball` is provided and, if local
+        (`remote` is False), verifies their existence. Constructs and returns
+        a `DPDKLocation` object with the provided parameters if validation is
+        successful, or `None` if neither `dpdk_tree` nor `tarball` is given.
+
+        Args:
+            d: The configuration dictionary.
+
+        Returns:
+            A DPDK location if construction is successful, otherwise None.
+
+        Raises:
+            ConfigurationError: If `dpdk_tree` or `tarball` not found in local filesystem.
+        """
+        dpdk_tree = d.get("dpdk_tree")
+        tarball = d.get("tarball")
+        remote = d.get("remote", False)
+
+        if dpdk_tree or tarball:
+            if not remote:
+                if dpdk_tree and not Path(dpdk_tree).is_dir():
+                    raise ConfigurationError(
+                        f"DPDK tree '{dpdk_tree}' not found in local filesystem."
+                    )
+                if tarball and not Path(tarball).is_file():
+                    raise ConfigurationError(
+                        f"DPDK tarball '{tarball}' not found in local filesystem."
+                    )
+
+            return cls(
+                dpdk_tree=dpdk_tree,
+                tarball=tarball,
+                remote=remote,
+                build_dir=d.get("dir_name"),
+            )
+
+        return None
+
+
 @dataclass(slots=True, frozen=True)
 class DPDKBuildInfo:
     """Various versions and other information about a DPDK build.
@@ -389,8 +451,8 @@ class DPDKBuildInfo:
         compiler_version: The version of the compiler used to build DPDK.
     """
 
-    dpdk_version: str
-    compiler_version: str
+    dpdk_version: str | None
+    compiler_version: str | None
 
 
 @dataclass(slots=True, frozen=True)
@@ -437,7 +499,8 @@ class TestRunConfiguration:
     and with what DPDK build.
 
     Attributes:
-        dpdk_build: A DPDK build to test.
+        dpdk_location: The location of the DPDK tree or tarball.
+        dpdk_build_config: A DPDK build configuration to test.
         perf: Whether to run performance tests.
         func: Whether to run functional tests.
         skip_smoke_tests: Whether to skip smoke tests.
@@ -447,7 +510,8 @@ class TestRunConfiguration:
         vdevs: The names of virtual devices to test.
     """
 
-    dpdk_build: DPDKBuildConfiguration
+    dpdk_location: DPDKLocation | None
+    dpdk_build_config: DPDKBuildConfiguration | None
     perf: bool
     func: bool
     skip_smoke_tests: bool
@@ -475,6 +539,18 @@ def from_dict(
         Returns:
             The test run configuration instance.
         """
+        dpdk_location = None
+        dpdk_build_config = None
+
+        dpdk_build_dct = d.get("dpdk_build")
+        if dpdk_build_dct:
+            dpdk_location = DPDKLocation.from_dict(dpdk_build_dct)
+            dpdk_build_config = (
+                DPDKBuildConfiguration.from_dict(dpdk_build_dct["build"])
+                if dpdk_build_dct.get("build")
+                else None
+            )
+
         test_suites: list[TestSuiteConfig] = list(map(TestSuiteConfig.from_dict, d["test_suites"]))
         sut_name = d["system_under_test_node"]["node_name"]
         skip_smoke_tests = d.get("skip_smoke_tests", False)
@@ -495,7 +571,8 @@ def from_dict(
             d["system_under_test_node"]["vdevs"] if "vdevs" in d["system_under_test_node"] else []
         )
         return cls(
-            dpdk_build=DPDKBuildConfiguration.from_dict(d["dpdk_build"]),
+            dpdk_location=dpdk_location,
+            dpdk_build_config=dpdk_build_config,
             perf=d["perf"],
             func=d["func"],
             skip_smoke_tests=skip_smoke_tests,
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index c0c347199e..03f0fe837f 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -110,9 +110,9 @@
         "mscv"
       ]
     },
-    "dpdk_build": {
+    "build": {
       "type": "object",
-      "description": "DPDK build configuration supported by DTS.",
+      "description": "DPDK build configuration supported by DTS. Either this or `dir_name` can be defined, but not both.",
       "properties": {
         "arch": {
           "type": "string",
@@ -146,6 +146,43 @@
         "compiler"
       ]
     },
+    "dpdk_build": {
+      "type": "object",
+      "description": "DPDK source and build configuration. Optional.",
+      "properties": {
+        "dpdk_tree": {
+          "type": "string",
+          "description": "Path to the DPDK source code. Either this or `tarball` can be defined, but not both."
+        },
+        "tarball": {
+          "type": "string",
+          "description": "Path to the DPDK tarball. Either this or `dpdk_tree` can be defined, but not both."
+        },
+        "remote": {
+          "type": "boolean",
+          "description": "If true, `dpdk_tree` or `tarball` is located on the SUT node."
+        },
+        "dir_name": {
+          "type": "string",
+          "description": "The name of a build directory located inside `dpdk_tree` or `tarball`. Either this or `build` can be defined, but not both."
+        },
+        "build": {
+          "$ref": "#/definitions/build"
+        }
+      },
+      "allOf": [
+        {
+          "not": {
+            "required": ["dpdk_tree", "tarball"]
+          }
+        },
+        {
+          "not": {
+            "required": ["dir_name", "build"]
+          }
+        }
+      ]
+    },
     "hugepages_2mb": {
       "type": "object",
       "description": "Optional hugepage configuration. If not specified, hugepages won't be configured and DTS will use system configuration.",
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
index 9b3c997c80..b82eec38fd 100644
--- a/dts/framework/config/types.py
+++ b/dts/framework/config/types.py
@@ -86,6 +86,21 @@ class DPDKBuildConfigDict(TypedDict):
     compiler_wrapper: str
 
 
+class DPDKSetupDict(TypedDict):
+    """Allowed keys and values."""
+
+    #:
+    dpdk_tree: str | None
+    #:
+    tarball: str | None
+    #:
+    remote: bool
+    #:
+    dir_name: str | None
+    #:
+    build: DPDKBuildConfigDict
+
+
 class TestSuiteConfigDict(TypedDict):
     """Allowed keys and values."""
 
@@ -108,7 +123,7 @@ class TestRunConfigDict(TypedDict):
     """Allowed keys and values."""
 
     #:
-    dpdk_build: DPDKBuildConfigDict
+    dpdk_build: DPDKSetupDict
     #:
     perf: bool
     #:
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index f45f789825..d967ede09b 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -184,8 +184,8 @@ class InteractiveCommandExecutionError(DTSError):
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
 
 
-class RemoteDirectoryExistsError(DTSError):
-    """A directory that exists on a remote node."""
+class RemoteFileNotFoundError(DTSError):
+    """A remote file or directory is requested but doesn't exist."""
 
     #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
diff --git a/dts/framework/remote_session/dpdk_shell.py b/dts/framework/remote_session/dpdk_shell.py
index c5f5c2d116..b39132cc42 100644
--- a/dts/framework/remote_session/dpdk_shell.py
+++ b/dts/framework/remote_session/dpdk_shell.py
@@ -104,4 +104,4 @@ def _update_real_path(self, path: PurePath) -> None:
 
         Adds the remote DPDK build directory to the path.
         """
-        super()._update_real_path(self._node.remote_dpdk_build_dir.joinpath(path))
+        super()._update_real_path(PurePath(self._node.remote_dpdk_build_dir).joinpath(path))
diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index a212ca2470..c4ac5db194 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -412,15 +412,27 @@ def _run_test_run(
             test_run_config: A test run configuration.
             test_run_result: The test run's result.
             test_suites_with_cases: The test suites with test cases to run.
+
+        Raises:
+            ConfigurationError: If the DPDK sources or build are not set up in the config or settings.
         """
         self._logger.info(
             f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
         )
         test_run_result.add_sut_info(sut_node.node_info)
         try:
-            sut_node.set_up_test_run(test_run_config)
+            dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_location
+            if not dpdk_location:
+                raise ConfigurationError("DPDK sources are not set up in the config or settings.")
+            elif not (dpdk_location.build_dir or test_run_config.dpdk_build_config):
+                raise ConfigurationError(
+                    "Neither the DPDK build config nor the DPDK build dir is set up "
+                    "in the config or settings."
+                )
+
+            sut_node.set_up_test_run(test_run_config, dpdk_location)
             test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
-            tg_node.set_up_test_run(test_run_config)
+            tg_node.set_up_test_run(test_run_config, dpdk_location)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
             self._logger.exception("Test run setup failed.")
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 2f7089a26b..97acd62fd8 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -39,10 +39,10 @@
 
     Set to any value to enable logging everything to the console.
 
-.. option:: -s, --skip-setup
-.. envvar:: DTS_SKIP_SETUP
+.. option:: --dpdk-tree
+.. envvar:: DTS_DPDK_TREE
 
-    Set to any value to skip building DPDK.
+    Path to DPDK source code tree to test.
 
 .. option:: --tarball, --snapshot
 .. envvar:: DTS_DPDK_TARBALL
@@ -55,10 +55,20 @@
     Git revision ID to test. Could be commit, tag, tree ID etc.
     To test local changes, first commit them, then use their commit ID.
 
+.. option:: --remote-source
+.. envvar:: DTS_REMOTE_SOURCE
+
+    Set when the DPDK source tree or tarball is located on the SUT node.
+
+.. option:: --build-dir
+.. envvar:: DTS_BUILD_DIR
+
+    The name of a build directory located inside the DPDK tree or tarball.
+
 .. option:: -f, --force
 .. envvar:: DTS_FORCE
 
-    Specify to remove an already existing dpdk tarball before copying/extracting a new one.
+    Specify to remove an already existing DPDK tarball or tree before copying/extracting a new one.
 
 .. option:: --test-suite
 .. envvar:: DTS_TEST_SUITES
@@ -90,7 +100,7 @@
 from pathlib import Path
 from typing import Callable
 
-from .config import TestSuiteConfig
+from .config import DPDKLocation, TestSuiteConfig
 from .exception import ConfigurationError
 from .utils import DPDKGitTarball, get_commit_id
 
@@ -111,9 +121,7 @@ class Settings:
     #:
     verbose: bool = False
     #:
-    skip_setup: bool = False
-    #:
-    dpdk_tarball_path: Path | str = ""
+    dpdk_location: DPDKLocation | None = None
     #:
     force: bool = False
     #:
@@ -241,14 +249,6 @@ def _get_help_string(self, action):
         return help
 
 
-def _parse_tarball_path(file_path: str) -> Path:
-    """Validate whether `file_path` is valid and return a Path object."""
-    path = Path(file_path)
-    if not path.exists() or not path.is_file():
-        raise argparse.ArgumentTypeError("The file path provided is not a valid file")
-    return path
-
-
 def _parse_revision_id(rev_id: str) -> str:
     """Validate revision ID and retrieve corresponding commit ID."""
     try:
@@ -257,6 +257,48 @@ def _parse_revision_id(rev_id: str) -> str:
         raise argparse.ArgumentTypeError("The Git revision ID supplied is invalid or ambiguous")
 
 
+def _required_with_one_of(parser: _DTSArgumentParser, action: Action, *required_dests: str) -> None:
+    """Verify that `action` is listed together with `required_dests`.
+
+    Verify that if `action` is present in the command-line arguments or environment variables,
+    then at least one of the arguments in `required_dests` is also present in the command-line
+    arguments or environment variables.
+
+    Args:
+        parser: The custom ArgumentParser object which contains `action`.
+        action: The action to be verified.
+        *required_dests: Destination variable names of the required arguments.
+
+    Raises:
+        argparse.ArgumentTypeError: If `action` is present but none of the required
+            arguments are.
+
+    Example:
+        For example, if the `--option1` argument is provided, then the `--option2` argument
+        must also be included. Only one of the `required_dests` needs to be present to
+        satisfy the check.
+
+        parser = _DTSArgumentParser()
+        option1_arg = parser.add_argument('--option1', dest='option1', action='store_true')
+        option2_arg = parser.add_argument('--option2', dest='option2', action='store_true')
+
+        _required_with_one_of(parser, option1_arg, 'option2')
+
+    """
+    if _is_action_in_args(action):
+        for required_dest in required_dests:
+            required_action = parser.find_action(required_dest)
+            if required_action is None:
+                continue
+
+            if _is_action_in_args(required_action):
+                return None
+
+        raise argparse.ArgumentTypeError(
+            f"The '{action.dest}' argument requires at least one of '{', '.join(required_dests)}'."
+        )
+
+
 def _get_parser() -> _DTSArgumentParser:
     """Create the argument parser for DTS.
 
@@ -311,21 +353,19 @@ def _get_parser() -> _DTSArgumentParser:
     )
     _add_env_var_to_action(action)
 
-    action = parser.add_argument(
-        "-s",
-        "--skip-setup",
-        action="store_true",
-        default=SETTINGS.skip_setup,
-        help="Specify to skip all setup steps on SUT and TG nodes.",
-    )
-    _add_env_var_to_action(action)
+    dpdk_source = parser.add_mutually_exclusive_group()
 
-    dpdk_source = parser.add_mutually_exclusive_group(required=True)
+    action = dpdk_source.add_argument(
+        "--dpdk-tree",
+        help="Path to DPDK source code tree to test.",
+        metavar="DIR_PATH",
+        dest="dpdk_tree_path",
+    )
+    _add_env_var_to_action(action, "DPDK_TREE")
 
     action = dpdk_source.add_argument(
         "--tarball",
         "--snapshot",
-        type=_parse_tarball_path,
         help="Path to DPDK source code tarball to test.",
         metavar="FILE_PATH",
         dest="dpdk_tarball_path",
@@ -344,6 +384,23 @@ def _get_parser() -> _DTSArgumentParser:
     )
     _add_env_var_to_action(action)
 
+    action = parser.add_argument(
+        "--remote-source",
+        action="store_true",
+        default=False,
+        help="Set when the DPDK source tree or tarball is located on the SUT node.",
+    )
+    _add_env_var_to_action(action)
+    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
+
+    action = parser.add_argument(
+        "--build-dir",
+        help="The name of a build directory located inside the DPDK tree or tarball.",
+        metavar="DIR_NAME",
+    )
+    _add_env_var_to_action(action)
+    _required_with_one_of(parser, action, "dpdk_tarball_path", "dpdk_tree_path")
+
     action = parser.add_argument(
         "-f",
         "--force",
@@ -395,6 +452,49 @@ def _get_parser() -> _DTSArgumentParser:
     return parser
 
 
+def _process_dpdk_location(
+    dpdk_tree: str | None,
+    tarball: str | None,
+    remote: bool,
+    build_dir: str | None,
+) -> DPDKLocation | None:
+    """Process and validate DPDK build arguments.
+
+    Ensures that either `dpdk_tree` or `tarball` is provided and, if local
+    (`remote` is False), verifies their existence. Constructs and returns
+    a `DPDKLocation` object with the provided parameters if validation is
+    successful, or `None` if neither `dpdk_tree` nor `tarball` is given.
+
+    Args:
+        dpdk_tree: The path to the DPDK tree.
+        tarball: The path to the DPDK tarball.
+        remote: If :data:`True`, `dpdk_tree` or `tarball` is on the SUT node.
+        build_dir: The name of a build directory located inside `dpdk_tree` or `tarball`.
+
+    Returns:
+        A DPDK location if construction is successful, otherwise None.
+
+    Raises:
+        argparse.ArgumentTypeError: If `dpdk_tree` or `tarball` is not found in the local filesystem.
+    """
+    if dpdk_tree or tarball:
+        if not remote:
+            if dpdk_tree and not Path(dpdk_tree).is_dir():
+                raise argparse.ArgumentTypeError(
+                    f"DPDK tree '{dpdk_tree}' not found in local filesystem."
+                )
+            if tarball and not Path(tarball).is_file():
+                raise argparse.ArgumentTypeError(
+                    f"DPDK tarball '{tarball}' not found in local filesystem."
+                )
+
+        return DPDKLocation(
+            dpdk_tree=dpdk_tree, tarball=tarball, remote=remote, build_dir=build_dir
+        )
+
+    return None
+
+
 def _process_test_suites(
     parser: _DTSArgumentParser, args: list[list[str]]
 ) -> list[TestSuiteConfig]:
@@ -424,16 +524,14 @@ def get_settings() -> Settings:
         The new settings object.
     """
     parser = _get_parser()
-
-    if len(sys.argv) == 1:
-        parser.print_help()
-        sys.exit(1)
-
     args = parser.parse_args()
 
     if args.dpdk_revision_id:
         args.dpdk_tarball_path = Path(DPDKGitTarball(args.dpdk_revision_id, args.output_dir))
 
+    args.dpdk_location = _process_dpdk_location(
+        args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.build_dir
+    )
     args.test_suites = _process_test_suites(parser, args.test_suites)
 
     kwargs = {k: v for k, v in vars(args).items() if hasattr(SETTINGS, k)}
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 9e9cd94e33..c4343602aa 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -29,16 +29,7 @@
 from types import FunctionType
 from typing import Union
 
-from .config import (
-    OS,
-    Architecture,
-    Compiler,
-    CPUType,
-    DPDKBuildInfo,
-    NodeInfo,
-    TestRunConfiguration,
-    TestSuiteConfig,
-)
+from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig
 from .exception import DTSError, ErrorSeverity
 from .logger import DTSLogger
 from .settings import SETTINGS
@@ -220,8 +211,8 @@ def add_stats(self, statistics: "Statistics") -> None:
 class DTSResult(BaseResult):
     """Stores environment information and test results from a DTS run.
 
-        * Test run level information, such as testbed, compiler, target OS and cpu and
-          the test suite list,
+        * Test run level information, such as testbed, compiler version, DPDK version
+          and the test suite list,
         * Test suite and test case results,
         * All errors that are caught and recorded during DTS execution.
 
@@ -318,10 +309,6 @@ class TestRunResult(BaseResult):
     The internal list stores the results of all test suites in a given test run.
 
     Attributes:
-        arch: The DPDK build architecture.
-        os: The DPDK build operating system.
-        cpu: The DPDK build CPU.
-        compiler: The DPDK build compiler.
         compiler_version: The DPDK build compiler version.
         dpdk_version: The built DPDK version.
         sut_os_name: The operating system of the SUT node.
@@ -329,10 +316,6 @@ class TestRunResult(BaseResult):
         sut_kernel_version: The operating system kernel version of the SUT node.
     """
 
-    arch: Architecture
-    os: OS
-    cpu: CPUType
-    compiler: Compiler
     compiler_version: str | None
     dpdk_version: str | None
     sut_os_name: str
@@ -348,10 +331,6 @@ def __init__(self, test_run_config: TestRunConfiguration):
             test_run_config: A test run configuration.
         """
         super().__init__()
-        self.arch = test_run_config.dpdk_build.arch
-        self.os = test_run_config.dpdk_build.os
-        self.cpu = test_run_config.dpdk_build.cpu
-        self.compiler = test_run_config.dpdk_build.compiler
         self.compiler_version = None
         self.dpdk_version = None
         self._config = test_run_config
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 12a40170ac..f048b57ed5 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -15,12 +15,11 @@
 
 from abc import ABC
 from ipaddress import IPv4Interface, IPv6Interface
-from typing import Any, Callable, Union
+from typing import Union
 
-from framework.config import OS, NodeConfiguration, TestRunConfiguration
+from framework.config import OS, DPDKLocation, NodeConfiguration, TestRunConfiguration
 from framework.exception import ConfigurationError
 from framework.logger import DTSLogger, get_dts_logger
-from framework.settings import SETTINGS
 
 from .cpu import (
     LogicalCore,
@@ -95,7 +94,9 @@ def _init_ports(self) -> None:
         for port in self.ports:
             self.configure_port_state(port)
 
-    def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
+    def set_up_test_run(
+        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+    ) -> None:
         """Test run setup steps.
 
         Configure hugepages on all DTS node types. Additional steps can be added by
@@ -104,6 +105,7 @@ def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
+            dpdk_location: The location of the DPDK tree or tarball.
         """
         self._setup_hugepages()
 
@@ -216,18 +218,6 @@ def close(self) -> None:
         for session in self._other_sessions:
             session.close()
 
-    @staticmethod
-    def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
-        """Skip the decorated function.
-
-        The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
-        environment variable enable the decorator.
-        """
-        if SETTINGS.skip_setup:
-            return lambda *args: None
-        else:
-            return func
-
 
 def create_session(node_config: NodeConfiguration, name: str, logger: DTSLogger) -> OSSession:
     """Factory for OS-aware sessions.
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index afc9ffb814..a9309ba38e 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -25,7 +25,7 @@
 from abc import ABC, abstractmethod
 from collections.abc import Iterable
 from ipaddress import IPv4Interface, IPv6Interface
-from pathlib import Path, PurePath
+from pathlib import Path, PurePath, PurePosixPath
 from typing import Union
 
 from framework.config import Architecture, NodeConfiguration, NodeInfo
@@ -137,17 +137,6 @@ def _get_privileged_command(command: str) -> str:
             The modified command that executes with administrative privileges.
         """
 
-    @abstractmethod
-    def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
-        """Try to find DPDK directory in `remote_dir`.
-
-        The directory is the one which is created after the extraction of the tarball. The files
-        are usually extracted into a directory starting with ``dpdk-``.
-
-        Returns:
-            The absolute path of the DPDK remote directory, empty path if not found.
-        """
-
     @abstractmethod
     def get_remote_tmp_dir(self) -> PurePath:
         """Get the path of the temporary directory of the remote OS.
@@ -177,6 +166,17 @@ def join_remote_path(self, *args: str | PurePath) -> PurePath:
             The resulting joined path.
         """
 
+    @abstractmethod
+    def remote_path_exists(self, remote_path: str | PurePath) -> bool:
+        """Check whether a path exists on the remote system.
+
+        Args:
+            remote_path: The path to check.
+
+        Returns:
+            True if the path exists, False otherwise.
+        """
+
     @abstractmethod
     def copy_from(
         self, source_file: str | PurePath, destination_dir: str | Path, force: bool = SETTINGS.force
@@ -321,6 +321,25 @@ def extract_remote_tarball(
                 extracting to prevent overwriting data.
         """
 
+    @abstractmethod
+    def get_tarball_top_dir(
+        self, remote_tarball_path: str | PurePath
+    ) -> str | PurePosixPath | None:
+        """Get the top directory of the remote tarball.
+
+        It examines the contents of a tarball located at the given `remote_tarball_path` and
+        determines the top-level directory. If all files and directories in the tarball share
+        the same top-level directory, that directory name is returned. If the tarball contains
+        multiple top-level directories or is empty, the method returns None.
+
+        Args:
+            remote_tarball_path: The path to the remote tarball.
+
+        Returns:
+            The top directory of the tarball if it contains a single top-level directory,
+            otherwise None.
+        """
+
     @abstractmethod
     def build_dpdk(
         self,
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 94aac68e8d..83c440711f 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -91,6 +91,11 @@ def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
         """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
         return PurePosixPath(*args)
 
+    def remote_path_exists(self, remote_path: str | PurePath) -> bool:
+        """Overrides :meth:`~.os_session.OSSession.remote_path_exists`."""
+        result = self.send_command(f"test -e {remote_path}")
+        return not result.return_code
+
     def copy_from(
         self, source_file: str | PurePath, destination_dir: str | Path, force: bool = SETTINGS.force
     ) -> None:
@@ -216,6 +221,16 @@ def extract_remote_tarball(
         if expected_dir:
             self.send_command(f"ls {expected_dir}", verify=True)
 
+    def get_tarball_top_dir(
+        self, remote_tarball_path: str | PurePath
+    ) -> str | PurePosixPath | None:
+        """Overrides :meth:`~.os_session.OSSession.get_tarball_top_dir`."""
+        members = self.send_command(f"tar tf {remote_tarball_path}").stdout.split()
+        top_dirs = [PurePosixPath(member).parts[0] for member in members if member]
+        if len(set(top_dirs)) == 1:
+            return top_dirs[0]
+        return None
+
     def build_dpdk(
         self,
         env_vars: dict,
@@ -321,7 +336,7 @@ def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[in
         pid_regex = r"p(\d+)"
         for dpdk_runtime_dir in dpdk_runtime_dirs:
             dpdk_config_file = PurePosixPath(dpdk_runtime_dir, "config")
-            if self._remote_files_exists(dpdk_config_file):
+            if self.remote_path_exists(dpdk_config_file):
                 out = self.send_command(f"lsof -Fp {dpdk_config_file}").stdout
                 if out and "No such file or directory" not in out:
                     for out_line in out.splitlines():
@@ -330,10 +345,6 @@ def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[in
                             pids.append(int(match.group(1)))
         return pids
 
-    def _remote_files_exists(self, remote_path: PurePath) -> bool:
-        result = self.send_command(f"test -e {remote_path}")
-        return not result.return_code
-
     def _check_dpdk_hugepages(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> None:
         """Check there aren't any leftover hugepages.
 
@@ -345,7 +356,7 @@ def _check_dpdk_hugepages(self, dpdk_runtime_dirs: Iterable[str | PurePath]) ->
         """
         for dpdk_runtime_dir in dpdk_runtime_dirs:
             hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
-            if self._remote_files_exists(hugepage_info):
+            if self.remote_path_exists(hugepage_info):
                 out = self.send_command(f"lsof -Fp {hugepage_info}").stdout
                 if out and "No such file or directory" not in out:
                     self._logger.warning("Some DPDK processes did not free hugepages.")
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 9bfb91816e..67af04d020 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -13,20 +13,20 @@
 
 
 import os
-import tarfile
 import time
 from pathlib import PurePath
 
 from framework.config import (
     DPDKBuildConfiguration,
     DPDKBuildInfo,
+    DPDKLocation,
     NodeInfo,
     SutNodeConfiguration,
     TestRunConfiguration,
 )
+from framework.exception import RemoteFileNotFoundError
 from framework.params.eal import EalParams
 from framework.remote_session.remote_session import CommandResult
-from framework.settings import SETTINGS
 from framework.utils import MesonArgs
 
 from .node import Node
@@ -39,14 +39,27 @@ class SutNode(Node):
 
     The SUT node extends :class:`Node` with DPDK specific features:
 
-        * DPDK build,
+        * Managing the DPDK source tree on the remote SUT,
+        * Building DPDK from source or using a pre-built version,
         * Gathering of DPDK build info,
         * The running of DPDK apps, interactively or one-time execution,
         * DPDK apps cleanup.
 
-    The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
-    environment variable configure the path to the DPDK tarball
-    or the git commit ID, tag ID or tree ID to test.
+    The :option:`--tarball` command line argument, :envvar:`DTS_DPDK_TARBALL` environment variable
+    and `tarball` inside `dpdk_build` in the configuration set the path to the DPDK tarball.
+
+    The :option:`--dpdk-tree` command line argument, :envvar:`DTS_DPDK_TREE` environment variable
+    and `dpdk_tree` inside `dpdk_build` in the configuration set the path to the DPDK tree.
+
+    The :option:`--remote-source` command line argument, :envvar:`DTS_REMOTE_SOURCE` environment
+    variable and `remote` inside `dpdk_build` in the configuration indicate that the `dpdk_tree`
+    or `tarball` is located on the SUT node.
+
+    The :option:`--build-dir` command line argument, :envvar:`DTS_BUILD_DIR` environment
+    variable and `dir_name` inside `dpdk_build` in the configuration set the name of a build
+    directory located inside the `dpdk_tree` or `tarball`.
+
+    Building DPDK from source uses the `build` configuration inside `dpdk_build`.
 
     Attributes:
         config: The SUT node configuration.
@@ -57,10 +70,10 @@ class SutNode(Node):
     virtual_devices: list[VirtualDevice]
     dpdk_prefix_list: list[str]
     dpdk_timestamp: str
-    _dpdk_build_config: DPDKBuildConfiguration | None
     _env_vars: dict
     _remote_tmp_dir: PurePath
-    __remote_dpdk_dir: PurePath | None
+    __remote_dpdk_tree_path: str | PurePath | None
+    _remote_dpdk_build_dir: PurePath | None
     _app_compile_timeout: float
     _dpdk_kill_session: OSSession | None
     _dpdk_version: str | None
@@ -77,10 +90,10 @@ def __init__(self, node_config: SutNodeConfiguration):
         super().__init__(node_config)
         self.virtual_devices = []
         self.dpdk_prefix_list = []
-        self._dpdk_build_config = None
         self._env_vars = {}
         self._remote_tmp_dir = self.main_session.get_remote_tmp_dir()
-        self.__remote_dpdk_dir = None
+        self.__remote_dpdk_tree_path = None
+        self._remote_dpdk_build_dir = None
         self._app_compile_timeout = 90
         self._dpdk_kill_session = None
         self.dpdk_timestamp = (
@@ -93,40 +106,34 @@ def __init__(self, node_config: SutNodeConfiguration):
         self._logger.info(f"Created node: {self.name}")
 
     @property
-    def _remote_dpdk_dir(self) -> PurePath:
-        """The remote DPDK dir.
-
-        This internal property should be set after extracting the DPDK tarball. If it's not set,
-        that implies the DPDK setup step has been skipped, in which case we can guess where
-        a previous build was located.
-        """
-        if self.__remote_dpdk_dir is None:
-            self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
-        return self.__remote_dpdk_dir
-
-    @_remote_dpdk_dir.setter
-    def _remote_dpdk_dir(self, value: PurePath) -> None:
-        self.__remote_dpdk_dir = value
+    def _remote_dpdk_tree_path(self) -> str | PurePath:
+        """The remote DPDK tree path."""
+        if self.__remote_dpdk_tree_path:
+            return self.__remote_dpdk_tree_path
+
+        self._logger.warning(
+            "Failed to get the remote DPDK tree path because its location on the "
+            "SUT node is unknown."
+        )
+        return ""
 
     @property
-    def remote_dpdk_build_dir(self) -> PurePath:
-        """The remote DPDK build directory.
-
-        This is the directory where DPDK was built.
-        We assume it was built in a subdirectory of the extracted tarball.
-        """
-        if self._dpdk_build_config:
-            return self.main_session.join_remote_path(
-                self._remote_dpdk_dir, self._dpdk_build_config.name
-            )
-        else:
-            return self.main_session.join_remote_path(self._remote_dpdk_dir, "build")
+    def remote_dpdk_build_dir(self) -> str | PurePath:
+        """The remote DPDK build dir path."""
+        if self._remote_dpdk_build_dir:
+            return self._remote_dpdk_build_dir
+
+        self._logger.warning(
+            "Failed to get the remote DPDK build dir because its location on the "
+            "SUT node is unknown."
+        )
+        return ""
 
     @property
-    def dpdk_version(self) -> str:
+    def dpdk_version(self) -> str | None:
         """Last built DPDK version."""
         if self._dpdk_version is None:
-            self._dpdk_version = self.main_session.get_dpdk_version(self._remote_dpdk_dir)
+            self._dpdk_version = self.main_session.get_dpdk_version(self._remote_dpdk_tree_path)
         return self._dpdk_version
 
     @property
@@ -137,26 +144,25 @@ def node_info(self) -> NodeInfo:
         return self._node_info
 
     @property
-    def compiler_version(self) -> str:
+    def compiler_version(self) -> str | None:
         """The node's compiler version."""
-        if self._compiler_version is None:
-            if self._dpdk_build_config is not None:
-                self._compiler_version = self.main_session.get_compiler_version(
-                    self._dpdk_build_config.compiler.name
-                )
-            else:
-                self._logger.warning(
-                    "Failed to get compiler version because _dpdk_build_config is None."
-                )
-                return ""
         return self._compiler_version
 
+    @compiler_version.setter
+    def compiler_version(self, value: str) -> None:
+        """Set the compiler version used on the SUT.
+
+        Args:
+            value: The node's compiler version.
+        """
+        self._compiler_version = value
+
     @property
-    def path_to_devbind_script(self) -> PurePath:
+    def path_to_devbind_script(self) -> PurePath | str:
         """The path to the dpdk-devbind.py script on the node."""
         if self._path_to_devbind_script is None:
             self._path_to_devbind_script = self.main_session.join_remote_path(
-                self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
+                self._remote_dpdk_tree_path, "usertools", "dpdk-devbind.py"
             )
         return self._path_to_devbind_script
 
@@ -168,101 +174,209 @@ def get_dpdk_build_info(self) -> DPDKBuildInfo:
         """
         return DPDKBuildInfo(dpdk_version=self.dpdk_version, compiler_version=self.compiler_version)
 
-    def _guess_dpdk_remote_dir(self) -> PurePath:
-        return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
+    def set_up_test_run(
+        self, test_run_config: TestRunConfiguration, dpdk_location: DPDKLocation
+    ) -> None:
+        """Extend the test run setup with vdev config and DPDK build setup.
 
-    def set_up_test_run(self, test_run_config: TestRunConfiguration) -> None:
-        """Extend the test run setup with vdev config.
+        This method extends the setup process by configuring virtual devices and preparing the DPDK
+        environment based on the provided configuration.
 
         Args:
             test_run_config: A test run configuration according to which
                 the setup steps will be taken.
+            dpdk_location: The location of the DPDK tree or tarball.
         """
-        super().set_up_test_run(test_run_config)
+        super().set_up_test_run(test_run_config, dpdk_location)
         for vdev in test_run_config.vdevs:
             self.virtual_devices.append(VirtualDevice(vdev))
-        self._set_up_dpdk(test_run_config.dpdk_build)
+        self._set_up_dpdk(dpdk_location, test_run_config.dpdk_build_config)
 
     def tear_down_test_run(self) -> None:
-        """Extend the test run teardown with virtual device teardown."""
+        """Extend the test run teardown with virtual device teardown and DPDK teardown."""
         super().tear_down_test_run()
         self.virtual_devices = []
         self._tear_down_dpdk()
 
-    def _set_up_dpdk(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
+    def _set_up_dpdk(
+        self, dpdk_location: DPDKLocation, dpdk_build_config: DPDKBuildConfiguration | None
+    ) -> None:
         """Set up DPDK the SUT node and bind ports.
 
-        DPDK setup includes setting all internals needed for the build, the copying of DPDK tarball
-        and then building DPDK. The drivers are bound to those that DPDK needs.
+        DPDK setup includes setting all internals needed for the build, copying the DPDK
+        sources and then building DPDK, or using an existing build from the `dpdk_location`.
+        The drivers are bound to those that DPDK needs.
 
         Args:
+            dpdk_location: The location of the DPDK tree or tarball.
             dpdk_build_config: The DPDK build test run configuration according to which
                 the setup steps will be taken.
         """
-        self._configure_dpdk_build(dpdk_build_config)
-        self._copy_dpdk_tarball()
-        self._build_dpdk()
+        self._set_remote_dpdk_tree_path(dpdk_location)
+        if not self._remote_dpdk_tree_path:
+            if dpdk_location.dpdk_tree:
+                self._copy_dpdk_tree(dpdk_location.dpdk_tree)
+            elif dpdk_location.tarball:
+                self._prepare_and_extract_dpdk_tarball(dpdk_location.tarball, dpdk_location.remote)
+
+        self._set_remote_dpdk_build_dir(dpdk_location.build_dir)
+        if not self.remote_dpdk_build_dir and dpdk_build_config:
+            self._configure_dpdk_build(dpdk_build_config)
+            self._build_dpdk()
+
         self.bind_ports_to_driver()
 
     def _tear_down_dpdk(self) -> None:
         """Reset DPDK variables and bind port driver to the OS driver."""
         self._env_vars = {}
-        self._dpdk_build_config = None
-        self.__remote_dpdk_dir = None
+        self.__remote_dpdk_tree_path = None
+        self._remote_dpdk_build_dir = None
         self._dpdk_version = None
         self._compiler_version = None
         self.bind_ports_to_driver(for_dpdk=False)
 
+    def _set_remote_dpdk_tree_path(self, dpdk_location: DPDKLocation):
+        """Set the path to the remote DPDK source tree based on the provided DPDK location.
+
+        Verifies that the DPDK source tree exists on the SUT node and sets the
+        `_remote_dpdk_tree_path` property.
+
+        Args:
+            dpdk_location: The location of the DPDK tree or tarball.
+
+        Raises:
+            RemoteFileNotFoundError: If the DPDK source tree is expected to be on the SUT node but
+                is not found.
+        """
+        if dpdk_location.remote and dpdk_location.dpdk_tree:
+            if self.main_session.remote_path_exists(dpdk_location.dpdk_tree):
+                self.__remote_dpdk_tree_path = PurePath(dpdk_location.dpdk_tree)
+            else:
+                raise RemoteFileNotFoundError(
+                    f"Remote DPDK source tree '{dpdk_location.dpdk_tree}' not found on the SUT node."
+                )
+
+    def _copy_dpdk_tree(self, dpdk_tree_path: str) -> None:
+        """Copy the DPDK source tree to the SUT.
+
+        Args:
+            dpdk_tree_path: The path to DPDK source tree on local filesystem.
+        """
+        self._logger.info(
+            f"Copying DPDK source tree to SUT: '{dpdk_tree_path}' into '{self._remote_tmp_dir}'."
+        )
+        self.main_session.copy_dir_to(dpdk_tree_path, self._remote_tmp_dir, exclude=".git")
+
+        self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
+            self._remote_tmp_dir, PurePath(dpdk_tree_path).name
+        )
+
+    def _prepare_and_extract_dpdk_tarball(self, dpdk_tarball: str, remote: bool) -> None:
+        """Ensure the DPDK tarball is available on the SUT node and extract it.
+
+        This method ensures that the DPDK source tree tarball is available on the
+        SUT node. If the `dpdk_tarball` is local, it is copied to the SUT node. If the
+        `dpdk_tarball` is already on the SUT node, it verifies its existence.
+        The `dpdk_tarball` is then extracted on the SUT node.
+
+        This method sets the `_remote_dpdk_tree_path` property to the path of the
+        extracted DPDK tree on the SUT node.
+
+        Args:
+            dpdk_tarball: The path to the DPDK tarball, either locally or on the SUT node.
+            remote: Indicates whether the `dpdk_tarball` is already on the SUT node.
+
+        Raises:
+            RemoteFileNotFoundError: If the `dpdk_tarball` is expected to be on the SUT node but
+                is not found.
+        """
+        if remote:
+            if not self.main_session.remote_path_exists(dpdk_tarball):
+                raise RemoteFileNotFoundError(
+                    f"Remote DPDK tarball '{dpdk_tarball}' not found on the SUT node."
+                )
+
+            remote_tarball_path = PurePath(dpdk_tarball)
+        else:
+            self._logger.info(
+                f"Copying DPDK tarball to SUT: '{dpdk_tarball}' into '{self._remote_tmp_dir}'."
+            )
+            self.main_session.copy_to(dpdk_tarball, self._remote_tmp_dir)
+
+            remote_tarball_path = self.main_session.join_remote_path(
+                self._remote_tmp_dir, PurePath(dpdk_tarball).name
+            )
+
+        tarball_top_dir = self.main_session.get_tarball_top_dir(remote_tarball_path)
+        self.__remote_dpdk_tree_path = self.main_session.join_remote_path(
+            PurePath(remote_tarball_path).parent,
+            tarball_top_dir or PurePath(remote_tarball_path).stem,
+        )
+
+        self._logger.info(
+            "Extracting DPDK tarball on SUT: "
+            f"'{remote_tarball_path}' into '{self._remote_dpdk_tree_path}'."
+        )
+        self.main_session.extract_remote_tarball(
+            remote_tarball_path,
+            self._remote_dpdk_tree_path,
+        )
+
+    def _set_remote_dpdk_build_dir(self, build_dir: str | None):
+        """Set the `remote_dpdk_build_dir` on the SUT.
+
+        Args:
+            build_dir: The name of a build directory located inside `_remote_dpdk_tree_path`.
+
+        Raises:
+            RemoteFileNotFoundError: If the `build_dir` does not exist on the SUT node.
+        """
+        if build_dir:
+            remote_dpdk_build_dir = self.main_session.join_remote_path(
+                self._remote_dpdk_tree_path, build_dir
+            )
+            if not self.main_session.remote_path_exists(remote_dpdk_build_dir):
+                raise RemoteFileNotFoundError(
+                    f"Remote DPDK build dir '{remote_dpdk_build_dir}' not found on the SUT node."
+                )
+
+            self._remote_dpdk_build_dir = PurePath(remote_dpdk_build_dir)
+
     def _configure_dpdk_build(self, dpdk_build_config: DPDKBuildConfiguration) -> None:
-        """Populate common environment variables and set DPDK build config."""
+        """Populate common environment variables and set the DPDK build-related properties.
+
+        This method sets `compiler_version` for additional information and `remote_dpdk_build_dir`
+        from the DPDK build config name.
+
+        Args:
+            dpdk_build_config: A DPDK build configuration to test.
+        """
         self._env_vars = {}
-        self._dpdk_build_config = dpdk_build_config
         self._env_vars.update(self.main_session.get_dpdk_build_env_vars(dpdk_build_config.arch))
         self._env_vars["CC"] = dpdk_build_config.compiler.name
         if dpdk_build_config.compiler_wrapper:
-            self._env_vars["CC"] = f"'{self._dpdk_build_config.compiler_wrapper} "
-            f"{self._dpdk_build_config.compiler.name}'"
-
-    @Node.skip_setup
-    def _copy_dpdk_tarball(self) -> None:
-        """Copy to and extract DPDK tarball on the SUT node."""
-        self._logger.info("Copying DPDK tarball to SUT.")
-        self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
-
-        # construct remote tarball path
-        # the basename is the same on local host and on remote Node
-        remote_tarball_path = self.main_session.join_remote_path(
-            self._remote_tmp_dir, os.path.basename(SETTINGS.dpdk_tarball_path)
-        )
+            self._env_vars[
+                "CC"
+            ] = f"'{dpdk_build_config.compiler_wrapper} {dpdk_build_config.compiler.name}'"
 
-        # construct remote path after extracting
-        with tarfile.open(SETTINGS.dpdk_tarball_path) as dpdk_tar:
-            dpdk_top_dir = dpdk_tar.getnames()[0]
-        self._remote_dpdk_dir = self.main_session.join_remote_path(
-            self._remote_tmp_dir, dpdk_top_dir
+        self.compiler_version = self.main_session.get_compiler_version(
+            dpdk_build_config.compiler.name
         )
 
-        self._logger.info(
-            f"Extracting DPDK tarball on SUT: "
-            f"'{remote_tarball_path}' into '{self._remote_dpdk_dir}'."
+        self._remote_dpdk_build_dir = self.main_session.join_remote_path(
+            self._remote_dpdk_tree_path, dpdk_build_config.name
         )
-        # clean remote path where we're extracting
-        self.main_session.remove_remote_dir(self._remote_dpdk_dir)
-
-        # then extract to remote path
-        self.main_session.extract_remote_tarball(remote_tarball_path, self._remote_dpdk_dir)
 
-    @Node.skip_setup
     def _build_dpdk(self) -> None:
         """Build DPDK.
 
-        Uses the already configured target. Assumes that the tarball has
-        already been copied to and extracted on the SUT node.
+        Uses the already configured DPDK build configuration. Assumes that the
+        `_remote_dpdk_tree_path` has already been set on the SUT node.
         """
         self.main_session.build_dpdk(
             self._env_vars,
             MesonArgs(default_library="static", enable_kmods=True, libdir="lib"),
-            self._remote_dpdk_dir,
+            self._remote_dpdk_tree_path,
             self.remote_dpdk_build_dir,
         )
 
@@ -285,7 +399,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
             self._env_vars,
             MesonArgs(examples=app_name, **meson_dpdk_args),  # type: ignore [arg-type]
             # ^^ https://github.com/python/mypy/issues/11583
-            self._remote_dpdk_dir,
+            self._remote_dpdk_tree_path,
             self.remote_dpdk_build_dir,
             rebuild=True,
             timeout=self._app_compile_timeout,
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread
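
A minimal usage sketch of how the options added in this patch combine (the `./main.py` entry
point follows the DTS guide, the paths and the "build" directory name are hypothetical, and the
same selections can also be made in the `dpdk_build` section of conf.yaml shown above):

    # Copy a local DPDK tree to the SUT and build it there,
    # using the build config from conf.yaml:
    ./main.py --dpdk-tree /path/to/local/dpdk

    # Reuse a tree already present on the SUT together with an
    # existing build directory, skipping the build step entirely:
    ./main.py --dpdk-tree /opt/dpdk --remote-source --build-dir build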

* [RFC PATCH v1 09/12] doc: update argument options for external DPDK build
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (7 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 08/12] dts: add support for externally compiled DPDK Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 10/12] dts: remove git ref option Juraj Linkeš
                   ` (2 subsequent siblings)
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 doc/guides/tools/dts.rst | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 059776c888..8aac22bc60 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -235,12 +235,14 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
      -t SECONDS, --timeout SECONDS
                            [DTS_TIMEOUT] The default timeout for all DTS operations except for compiling DPDK. (default: 15)
      -v, --verbose         [DTS_VERBOSE] Specify to enable verbose output, logging all messages to the console. (default: False)
-     -s, --skip-setup      [DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes. (default: False)
+     --dpdk-tree DIR_PATH  [DTS_DPDK_TREE] Path to DPDK source code tree to test. (default: None)
      --tarball FILE_PATH, --snapshot FILE_PATH
                            [DTS_DPDK_TARBALL] Path to DPDK source code tarball to test. (default: None)
      --revision ID, --rev ID, --git-ref ID
                            [DTS_DPDK_REVISION_ID] Git revision ID to test. Could be commit, tag, tree ID etc. To test local changes, first
                            commit them, then use their commit ID. (default: None)
+     --remote-source       [DTS_REMOTE_SOURCE] Set when the DPDK source tree or tarball is located on the SUT node. (default: False)
+     --build-dir DIR_NAME  [DTS_BUILD_DIR] The name of a build directory located inside the DPDK tree or tarball. (default: None)
      -f, --force           [DTS_FORCE] Specify to remove an already existing dpdk tarball before copying/extracting a new one. (default: False)
      --compile-timeout SECONDS
                            [DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK. (default: 1200)
@@ -255,8 +257,8 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
 
 
 The brackets contain the names of environment variables that set the same thing.
-The minimum DTS needs is a config file and a DPDK tarball or git ref ID.
-You may pass those to DTS using the command line arguments or use the default paths.
+The minimum DTS needs is a config file and a DPDK source, which can be specified in the config
+file or with a command line argument/environment variable.
 
 Example command for running DTS with the template configuration and DPDK tag v23.11:
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread
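
As a sketch of the environment-variable form documented above (the variable names are the ones
listed in the help output; the values and the `./main.py` invocation are hypothetical, assuming
boolean variables follow the existing "set to any value" convention):

    DTS_DPDK_TREE=/opt/dpdk DTS_REMOTE_SOURCE=1 DTS_BUILD_DIR=build ./main.py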

* [RFC PATCH v1 10/12] dts: remove git ref option
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (8 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 09/12] doc: update argument options for external DPDK build Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 11/12] doc: remove git-ref argument Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 12/12] dts: improve statistics Juraj Linkeš
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 dts/framework/settings.py |  31 ----------
 dts/framework/utils.py    | 117 --------------------------------------
 2 files changed, 148 deletions(-)

diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 97acd62fd8..d514e887d3 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -49,12 +49,6 @@
 
     Path to DPDK source code tarball to test.
 
-.. option:: --revision, --rev, --git-ref
-.. envvar:: DTS_DPDK_REVISION_ID
-
-    Git revision ID to test. Could be commit, tag, tree ID etc.
-    To test local changes, first commit them, then use their commit ID.
-
 .. option:: --remote-source
 .. envvar:: DTS_REMOTE_SOURCE
 
@@ -101,8 +95,6 @@
 from typing import Callable
 
 from .config import DPDKLocation, TestSuiteConfig
-from .exception import ConfigurationError
-from .utils import DPDKGitTarball, get_commit_id
 
 
 @dataclass(slots=True)
@@ -249,14 +241,6 @@ def _get_help_string(self, action):
         return help
 
 
-def _parse_revision_id(rev_id: str) -> str:
-    """Validate revision ID and retrieve corresponding commit ID."""
-    try:
-        return get_commit_id(rev_id)
-    except ConfigurationError:
-        raise argparse.ArgumentTypeError("The Git revision ID supplied is invalid or ambiguous")
-
-
 def _required_with_one_of(parser: _DTSArgumentParser, action: Action, *required_dests: str) -> None:
     """Verify that `action` is listed together with `required_dests`.
 
@@ -372,18 +356,6 @@ def _get_parser() -> _DTSArgumentParser:
     )
     _add_env_var_to_action(action, "DPDK_TARBALL")
 
-    action = dpdk_source.add_argument(
-        "--revision",
-        "--rev",
-        "--git-ref",
-        type=_parse_revision_id,
-        help="Git revision ID to test. Could be commit, tag, tree ID etc. "
-        "To test local changes, first commit them, then use their commit ID.",
-        metavar="ID",
-        dest="dpdk_revision_id",
-    )
-    _add_env_var_to_action(action)
-
     action = parser.add_argument(
         "--remote-source",
         action="store_true",
@@ -526,9 +498,6 @@ def get_settings() -> Settings:
     parser = _get_parser()
     args = parser.parse_args()
 
-    if args.dpdk_revision_id:
-        args.dpdk_tarball_path = Path(DPDKGitTarball(args.dpdk_revision_id, args.output_dir))
-
     args.dpdk_location = _process_dpdk_location(
         args.dpdk_tree_path, args.dpdk_tarball_path, args.remote_source, args.build_dir
     )
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index 5757872fbd..37313c268b 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -14,21 +14,16 @@
     REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
 """
 
-import atexit
 import fnmatch
 import json
 import os
-import subprocess
 import tarfile
 from enum import Enum
 from pathlib import Path
-from subprocess import SubprocessError
 from typing import Any
 
 from scapy.packet import Packet  # type: ignore[import-untyped]
 
-from .exception import ConfigurationError
-
 REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
 
 
@@ -74,31 +69,6 @@ def get_packet_summaries(packets: list[Packet]) -> str:
     return f"Packet contents: \n{packet_summaries}"
 
 
-def get_commit_id(rev_id: str) -> str:
-    """Given a Git revision ID, return the corresponding commit ID.
-
-    Args:
-        rev_id: The Git revision ID.
-
-    Raises:
-        ConfigurationError: The ``git rev-parse`` command failed, suggesting
-            an invalid or ambiguous revision ID was supplied.
-    """
-    result = subprocess.run(
-        ["git", "rev-parse", "--verify", rev_id],
-        text=True,
-        capture_output=True,
-    )
-    if result.returncode != 0:
-        raise ConfigurationError(
-            f"{rev_id} is not a valid git reference.\n"
-            f"Command: {result.args}\n"
-            f"Stdout: {result.stdout}\n"
-            f"Stderr: {result.stderr}"
-        )
-    return result.stdout.strip()
-
-
 class StrEnum(Enum):
     """Enum with members stored as strings."""
 
@@ -174,93 +144,6 @@ def extension(self):
         return f".{self.value}" if self == self.none else f".{self.none.value}.{self.value}"
 
 
-class DPDKGitTarball:
-    """Compressed tarball of DPDK from the repository.
-
-    The class supports the :class:`os.PathLike` protocol,
-    which is used to get the Path of the tarball::
-
-        from pathlib import Path
-        tarball = DPDKGitTarball("HEAD", "output")
-        tarball_path = Path(tarball)
-    """
-
-    _git_ref: str
-    _tar_compression_format: TarCompressionFormat
-    _tarball_dir: Path
-    _tarball_name: str
-    _tarball_path: Path | None
-
-    def __init__(
-        self,
-        git_ref: str,
-        output_dir: str,
-        tar_compression_format: TarCompressionFormat = TarCompressionFormat.xz,
-    ):
-        """Create the tarball during initialization.
-
-        The DPDK version is specified with `git_ref`. The tarball will be compressed with
-        `tar_compression_format`, which must be supported by the DTS execution environment.
-        The resulting tarball will be put into `output_dir`.
-
-        Args:
-            git_ref: A git commit ID, tag ID or tree ID.
-            output_dir: The directory where to put the resulting tarball.
-            tar_compression_format: The compression format to use.
-        """
-        self._git_ref = git_ref
-        self._tar_compression_format = tar_compression_format
-
-        self._tarball_dir = Path(output_dir, "tarball")
-
-        self._create_tarball_dir()
-
-        self._tarball_name = f"dpdk-tarball-{self._git_ref}{self._tar_compression_format.extension}"
-        self._tarball_path = self._check_tarball_path()
-        if not self._tarball_path:
-            self._create_tarball()
-
-    def _create_tarball_dir(self) -> None:
-        os.makedirs(self._tarball_dir, exist_ok=True)
-
-    def _check_tarball_path(self) -> Path | None:
-        if self._tarball_name in os.listdir(self._tarball_dir):
-            return Path(self._tarball_dir, self._tarball_name)
-        return None
-
-    def _create_tarball(self) -> None:
-        self._tarball_path = Path(self._tarball_dir, self._tarball_name)
-
-        atexit.register(self._delete_tarball)
-
-        result = subprocess.run(
-            'git -C "$(git rev-parse --show-toplevel)" archive '
-            f'{self._git_ref} --prefix="dpdk-tarball-{self._git_ref + os.sep}" | '
-            f"{self._tar_compression_format} > {Path(self._tarball_path.absolute())}",
-            shell=True,
-            text=True,
-            capture_output=True,
-        )
-
-        if result.returncode != 0:
-            raise SubprocessError(
-                f"Git archive creation failed with exit code {result.returncode}.\n"
-                f"Command: {result.args}\n"
-                f"Stdout: {result.stdout}\n"
-                f"Stderr: {result.stderr}"
-            )
-
-        atexit.unregister(self._delete_tarball)
-
-    def _delete_tarball(self) -> None:
-        if self._tarball_path and os.path.exists(self._tarball_path):
-            os.remove(self._tarball_path)
-
-    def __fspath__(self) -> str:
-        """The os.PathLike protocol implementation."""
-        return str(self._tarball_path)
-
-
 def ensure_list_of_strings(value: Any | list[Any]) -> list[str]:
     """Ensure the input is a list of strings.
 
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread

* [RFC PATCH v1 11/12] doc: remove git-ref argument
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (9 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 10/12] dts: remove git ref option Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  2024-09-06 13:26 ` [RFC PATCH v1 12/12] dts: improve statistics Juraj Linkeš
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 doc/guides/tools/dts.rst | 8 --------
 1 file changed, 8 deletions(-)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 8aac22bc60..55e9c37c9b 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -236,8 +236,6 @@ DTS is run with ``main.py`` located in the ``dts`` directory after entering Poet
                            [DTS_TIMEOUT] The default timeout for all DTS operations except for compiling DPDK. (default: 15)
      -v, --verbose         [DTS_VERBOSE] Specify to enable verbose output, logging all messages to the console. (default: False)
      --dpdk-tree DIR_PATH  [DTS_DPDK_TREE] Path to DPDK source code tree to test. (default: None)
-     --tarball FILE_PATH, --snapshot FILE_PATH
-                           [DTS_DPDK_TARBALL] Path to DPDK source code tarball to test. (default: None)
      --revision ID, --rev ID, --git-ref ID
                            [DTS_DPDK_REVISION_ID] Git revision ID to test. Could be commit, tag, tree ID etc. To test local changes, first
                            commit them, then use their commit ID. (default: None)
@@ -260,12 +258,6 @@ The brackets contain the names of environment variables that set the same thing.
 The minimum DTS needs is a config file and a DPDK source, which can be supplied
 either in the config file or via a command line argument/environment variable.
 
-Example command for running DTS with the template configuration and DPDK tag v23.11:
-
-.. code-block:: console
-
-   (dts-py3.10) $ ./main.py --git-ref v23.11
-
 
 DTS Results
 ~~~~~~~~~~~
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread
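
With the git-ref example removed above, the remaining documented options still
cover the "config file plus a DPDK source" minimum. A sketch of equivalent
invocations, assuming a locally checked-out tree at ~/dpdk (the path is
hypothetical; --dpdk-tree and --revision are the options shown in the usage
text earlier in this patch, and v23.11 is the tag from the removed example):

   (dts-py3.10) $ ./main.py --dpdk-tree ~/dpdk
   (dts-py3.10) $ ./main.py --revision v23.11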

* [RFC PATCH v1 12/12] dts: improve statistics
  2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
                   ` (10 preceding siblings ...)
  2024-09-06 13:26 ` [RFC PATCH v1 11/12] doc: remove git-ref argument Juraj Linkeš
@ 2024-09-06 13:26 ` Juraj Linkeš
  11 siblings, 0 replies; 13+ messages in thread
From: Juraj Linkeš @ 2024-09-06 13:26 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, paul.szczepanek, Luca.Vizzarro,
	alex.chapman, probb, jspewock, npratte, dmarx
  Cc: dev, Tomáš Ďurovec

From: Tomáš Ďurovec <tomas.durovec@pantheon.tech>

Signed-off-by: Tomáš Ďurovec <tomas.durovec@pantheon.tech>
---
 dts/framework/runner.py      |   5 +-
 dts/framework/test_result.py | 272 +++++++++++++++++++++++------------
 2 files changed, 187 insertions(+), 90 deletions(-)

diff --git a/dts/framework/runner.py b/dts/framework/runner.py
index c4ac5db194..ff8270a8d7 100644
--- a/dts/framework/runner.py
+++ b/dts/framework/runner.py
@@ -419,7 +419,8 @@ def _run_test_run(
         self._logger.info(
             f"Running test run with SUT '{test_run_config.system_under_test_node.name}'."
         )
-        test_run_result.add_sut_info(sut_node.node_info)
+        test_run_result.ports = sut_node.ports
+        test_run_result.sut_info = sut_node.node_info
         try:
             dpdk_location = SETTINGS.dpdk_location or test_run_config.dpdk_location
             if not dpdk_location:
@@ -431,7 +432,7 @@ def _run_test_run(
                 )
 
             sut_node.set_up_test_run(test_run_config, dpdk_location)
-            test_run_result.add_dpdk_build_info(sut_node.get_dpdk_build_info())
+            test_run_result.dpdk_build_info = sut_node.get_dpdk_build_info()
             tg_node.set_up_test_run(test_run_config, dpdk_location)
             test_run_result.update_setup(Result.PASS)
         except Exception as e:
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index c4343602aa..cfa1171d7b 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -22,18 +22,20 @@
 variable modify the directory where the files with results will be stored.
 """
 
-import os.path
+import json
 from collections.abc import MutableSequence
-from dataclasses import dataclass
+from dataclasses import asdict, dataclass
 from enum import Enum, auto
+from pathlib import Path
 from types import FunctionType
-from typing import Union
+from typing import Any, TypedDict
 
 from .config import DPDKBuildInfo, NodeInfo, TestRunConfiguration, TestSuiteConfig
 from .exception import DTSError, ErrorSeverity
 from .logger import DTSLogger
 from .settings import SETTINGS
 from .test_suite import TestSuite
+from .testbed_model.port import Port
 
 
 @dataclass(slots=True, frozen=True)
@@ -85,6 +87,29 @@ def __bool__(self) -> bool:
         return self is self.PASS
 
 
+class TestCaseResultDict(TypedDict):
+    test_case_name: str
+    result: str
+
+
+class TestSuiteResultDict(TypedDict):
+    test_suite_name: str
+    test_cases: list[TestCaseResultDict]
+
+
+class TestRunResultDict(TypedDict, total=False):
+    compiler_version: str | None
+    dpdk_version: str | None
+    ports: list[dict[str, Any]] | None
+    test_suites: list[TestSuiteResultDict]
+    summary: dict[str, Any]
+
+
+class DtsRunResultDict(TypedDict):
+    test_runs: list[TestRunResultDict]
+    summary: dict[str, Any]
+
+
 class FixtureResult:
     """A record that stores the result of a setup or a teardown.
 
@@ -198,14 +223,12 @@ def get_errors(self) -> list[Exception]:
         """
         return self._get_setup_teardown_errors() + self._get_child_errors()
 
-    def add_stats(self, statistics: "Statistics") -> None:
-        """Collate stats from the whole result hierarchy.
+    def to_dict(self):
+        """ """
 
-        Args:
-            statistics: The :class:`Statistics` object where the stats will be collated.
-        """
+    def add_result(self, results: dict[str, Any] | dict[str, float]):
         for child_result in self.child_results:
-            child_result.add_stats(statistics)
+            child_result.add_result(results)
 
 
 class DTSResult(BaseResult):
@@ -229,8 +252,6 @@ class DTSResult(BaseResult):
     _logger: DTSLogger
     _errors: list[Exception]
     _return_code: ErrorSeverity
-    _stats_result: Union["Statistics", None]
-    _stats_filename: str
 
     def __init__(self, logger: DTSLogger):
         """Extend the constructor with top-level specifics.
@@ -243,8 +264,6 @@ def __init__(self, logger: DTSLogger):
         self._logger = logger
         self._errors = []
         self._return_code = ErrorSeverity.NO_ERR
-        self._stats_result = None
-        self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
 
     def add_test_run(self, test_run_config: TestRunConfiguration) -> "TestRunResult":
         """Add and return the child result (test run).
@@ -281,10 +300,8 @@ def process(self) -> None:
             for error in self._errors:
                 self._logger.debug(repr(error))
 
-        self._stats_result = Statistics(self.dpdk_version)
-        self.add_stats(self._stats_result)
-        with open(self._stats_filename, "w+") as stats_file:
-            stats_file.write(str(self._stats_result))
+        TextSummary(self).save(Path(SETTINGS.output_dir, "results_summary.txt"))
+        JsonResults(self).save(Path(SETTINGS.output_dir, "results.json"))
 
     def get_return_code(self) -> int:
         """Go through all stored Exceptions and return the final DTS error code.
@@ -302,6 +319,16 @@ def get_return_code(self) -> int:
 
         return int(self._return_code)
 
+    def to_dict(self) -> DtsRunResultDict:
+        def merge_all_results(all_results: list[dict[str, Any]]) -> dict[str, Any]:
+            return {key.name: sum(d[key.name] for d in all_results) for key in Result}
+
+        test_runs = [child.to_dict() for child in self.child_results]
+        return {
+            "test_runs": test_runs,
+            "summary": merge_all_results([test_run["summary"] for test_run in test_runs]),
+        }
+
 
 class TestRunResult(BaseResult):
     """The test run specific result.
@@ -316,13 +343,11 @@ class TestRunResult(BaseResult):
         sut_kernel_version: The operating system kernel version of the SUT node.
     """
 
-    compiler_version: str | None
-    dpdk_version: str | None
-    sut_os_name: str
-    sut_os_version: str
-    sut_kernel_version: str
     _config: TestRunConfiguration
     _test_suites_with_cases: list[TestSuiteWithCases]
+    _ports: list[Port]
+    _sut_info: NodeInfo | None
+    _dpdk_build_info: DPDKBuildInfo | None
 
     def __init__(self, test_run_config: TestRunConfiguration):
         """Extend the constructor with the test run's config.
@@ -331,10 +356,11 @@ def __init__(self, test_run_config: TestRunConfiguration):
             test_run_config: A test run configuration.
         """
         super().__init__()
-        self.compiler_version = None
-        self.dpdk_version = None
         self._config = test_run_config
         self._test_suites_with_cases = []
+        self._ports = []
+        self._sut_info = None
+        self._dpdk_build_info = None
 
     def add_test_suite(
         self,
@@ -374,24 +400,70 @@ def test_suites_with_cases(self, test_suites_with_cases: list[TestSuiteWithCases
             )
         self._test_suites_with_cases = test_suites_with_cases
 
-    def add_sut_info(self, sut_info: NodeInfo) -> None:
-        """Add SUT information gathered at runtime.
+    @property
+    def ports(self) -> list[Port]:
+        """The list of ports associated with the test run.
 
-        Args:
-            sut_info: The additional SUT node information.
+        This list stores all the ports that are involved in the test run.
+        Ports can only be assigned once, and attempting to modify them after
+        assignment will raise an error.
+
+        Returns:
+            A list of `Port` objects associated with the test run.
         """
-        self.sut_os_name = sut_info.os_name
-        self.sut_os_version = sut_info.os_version
-        self.sut_kernel_version = sut_info.kernel_version
+        return self._ports
 
-    def add_dpdk_build_info(self, versions: DPDKBuildInfo) -> None:
-        """Add information about the DPDK build gathered at runtime.
+    @ports.setter
+    def ports(self, ports: list[Port]) -> None:
+        if self._ports:
+            raise ValueError(
+                "Attempted to assign ports to a test run result which already has ports."
+            )
+        self._ports = ports
 
-        Args:
-            versions: The additional information.
-        """
-        self.compiler_version = versions.compiler_version
-        self.dpdk_version = versions.dpdk_version
+    @property
+    def sut_info(self) -> NodeInfo | None:
+        return self._sut_info
+
+    @sut_info.setter
+    def sut_info(self, sut_info: NodeInfo) -> None:
+        if self._sut_info:
+            raise ValueError(
+                "Attempted to assign `sut_info` to a test run result which already has `sut_info`."
+            )
+        self._sut_info = sut_info
+
+    @property
+    def dpdk_build_info(self) -> DPDKBuildInfo | None:
+        return self._dpdk_build_info
+
+    @dpdk_build_info.setter
+    def dpdk_build_info(self, dpdk_build_info: DPDKBuildInfo) -> None:
+        if self._dpdk_build_info:
+            raise ValueError(
+                "Attempted to assign `dpdk_build_info` to a test run result which already "
+                "has `dpdk_build_info`."
+            )
+        self._dpdk_build_info = dpdk_build_info
+
+    def to_dict(self) -> TestRunResultDict:
+        results = {result.name: 0 for result in Result}
+        self.add_result(results)
+
+        compiler_version = None
+        dpdk_version = None
+
+        if self.dpdk_build_info:
+            compiler_version = self.dpdk_build_info.compiler_version
+            dpdk_version = self.dpdk_build_info.dpdk_version
+
+        return {
+            "compiler_version": compiler_version,
+            "dpdk_version": dpdk_version,
+            "ports": [asdict(port) for port in self.ports] or None,
+            "test_suites": [child.to_dict() for child in self.child_results],
+            "summary": results,
+        }
 
     def _block_result(self) -> None:
         r"""Mark the result as :attr:`~Result.BLOCK`\ed."""
@@ -436,6 +508,12 @@ def add_test_case(self, test_case_name: str) -> "TestCaseResult":
         self.child_results.append(result)
         return result
 
+    def to_dict(self) -> TestSuiteResultDict:
+        return {
+            "test_suite_name": self.test_suite_name,
+            "test_cases": [child.to_dict() for child in self.child_results],
+        }
+
     def _block_result(self) -> None:
         r"""Mark the result as :attr:`~Result.BLOCK`\ed."""
         for test_case_method in self._test_suite_with_cases.test_cases:
@@ -483,16 +561,12 @@ def _get_child_errors(self) -> list[Exception]:
             return [self.error]
         return []
 
-    def add_stats(self, statistics: "Statistics") -> None:
-        r"""Add the test case result to statistics.
+    def to_dict(self) -> TestCaseResultDict:
+        """Convert the test case result to a dictionary."""
+        return {"test_case_name": self.test_case_name, "result": self.result.name}
 
-        The base method goes through the hierarchy recursively and this method is here to stop
-        the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
-
-        Args:
-            statistics: The :class:`Statistics` object where the stats will be added.
-        """
-        statistics += self.result
+    def add_result(self, results: dict[str, Any]):
+        results[self.result.name] += 1
 
     def _block_result(self) -> None:
         r"""Mark the result as :attr:`~Result.BLOCK`\ed."""
@@ -503,53 +577,75 @@ def __bool__(self) -> bool:
         return bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
 
 
-class Statistics(dict):
-    """How many test cases ended in which result state along some other basic information.
+class TextSummary:
+    def __init__(self, dts_run_result: DTSResult) -> None:
+        self._results_dict = dts_run_result.to_dict()
+        self._summary = self._results_dict["summary"]
+        self._text = ""
 
-    Subclassing :class:`dict` provides a convenient way to format the data.
-
-    The data are stored in the following keys:
-
-    * **PASS RATE** (:class:`int`) -- The FAIL/PASS ratio of all test cases.
-    * **DPDK VERSION** (:class:`str`) -- The tested DPDK version.
-    """
+    @property
+    def _outdent(self) -> str:
+        """Appropriate indentation based on multiple test run results."""
+        return "\t" if len(self._dics_result["test_runs"]) > 1 else ""
 
-    def __init__(self, dpdk_version: str | None):
-        """Extend the constructor with keys in which the data are stored.
+    def save(self, output_path: Path):
+        """Save the generated text statistics to a file.
 
         Args:
-            dpdk_version: The version of tested DPDK.
+            output_path: The path where the text file will be saved.
         """
-        super().__init__()
-        for result in Result:
-            self[result.name] = 0
-        self["PASS RATE"] = 0.0
-        self["DPDK VERSION"] = dpdk_version
+        if self._dics_result["test_runs"]:
+            with open(f"{output_path}", "w") as fp:
+                self._init_text()
+                fp.write(self._text)
+
+    def _init_text(self):
+        if len(self._results_dict["test_runs"]) > 1:
+            self._add_test_runs_dicts()
+            self._add_overall_results()
+        else:
+            test_run_result = self._results_dict["test_runs"][0]
+            self._add_test_run_dict(test_run_result)
+
+    def _add_test_runs_dicts(self):
+        for idx, test_run_dict in enumerate(self._results_dict["test_runs"]):
+            self._text += f"TEST_RUN_{idx}\n"
+            self._add_test_run_dict(test_run_dict)
+            self._text += "\n"
+
+    def _add_test_run_dict(self, test_run_dict: TestRunResultDict):
+        self._add_pass_rate_to_results(test_run_dict["summary"])
+        self._add_column(
+            DPDK_VERSION=test_run_dict["dpdk_version"],
+            COMPILER_VERSION=test_run_dict["compiler_version"],
+            **test_run_dict["summary"],
+        )
+
+    def _add_pass_rate_to_results(self, results_dict: dict[str, Any]):
+        results_dict["PASS_RATE"] = (
+            float(results_dict[Result.PASS.name])
+            * 100
+            / sum(results_dict[result.name] for result in Result)
+        )
 
-    def __iadd__(self, other: Result) -> "Statistics":
-        """Add a Result to the final count.
+    def _add_column(self, **rows):
+        rows = {k: "N/A" if v is None else v for k, v in rows.items()}
+        max_length = len(max(rows, key=len))
+        for key, value in rows.items():
+            self._text += f"{self._outdent}{key:<{max_length}} = {value}\n"
 
-        Example:
-            stats: Statistics = Statistics()  # empty Statistics
-            stats += Result.PASS  # add a Result to `stats`
+    def _add_overall_results(self):
+        self._text += "OVERALL\n"
+        self._add_pass_rate_to_results(self._summary)
+        self._add_column(**self._summary)
 
-        Args:
-            other: The Result to add to this statistics object.
 
-        Returns:
-            The modified statistics object.
-        """
-        self[other.name] += 1
-        self["PASS RATE"] = (
-            float(self[Result.PASS.name]) * 100 / sum(self[result.name] for result in Result)
-        )
-        return self
-
-    def __str__(self) -> str:
-        """Each line contains the formatted key = value pair."""
-        stats_str = ""
-        for key, value in self.items():
-            stats_str += f"{key:<12} = {value}\n"
-            # according to docs, we should use \n when writing to text files
-            # on all platforms
-        return stats_str
+class JsonResults:
+    _results_dict: DtsRunResultDict
+
+    def __init__(self, dts_run_result: DTSResult):
+        self._results_dict = dts_run_result.to_dict()
+
+    def save(self, output_path: Path):
+        with open(f"{output_path}", "w") as fp:
+            json.dump(self._results_dict, fp, indent=4)
-- 
2.43.0


^ permalink raw reply	[flat|nested] 13+ messages in thread
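
The JSON written by JsonResults follows the TypedDict shapes introduced in this
patch: a top-level dict with "test_runs" and "summary", each test run carrying
the DPDK build info, ports, per-suite results and its own summary of Result
counts. A minimal consumer sketch, assuming the default "output" directory and
using only the keys defined above (the script itself is illustrative, not part
of the series):

    import json
    from pathlib import Path

    # Load the results produced by DTSResult.process().
    results = json.loads(Path("output", "results.json").read_text())

    # The summary maps Result member names (e.g. PASS) to counts.
    summary = results["summary"]
    total = sum(summary.values())
    pass_rate = summary.get("PASS", 0) * 100 / total if total else 0.0
    print(f"{len(results['test_runs'])} test run(s), pass rate {pass_rate:.1f}%")

    # Walk test runs, suites and cases using the TypedDict keys.
    for test_run in results["test_runs"]:
        print(test_run["dpdk_version"], test_run["compiler_version"])
        for suite in test_run["test_suites"]:
            for case in suite["test_cases"]:
                print(f"  {suite['test_suite_name']}: {case['test_case_name']} = {case['result']}")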

end of thread, other threads:[~2024-09-06 13:28 UTC | newest]

Thread overview: 13+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-09-06 13:26 [RFC PATCH v1 00/12] DTS external DPDK build and stats Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 01/12] dts: rename build target to DPDK build Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 02/12] dts: one dpdk build per test run Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 03/12] dts: fix remote session transferring files Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 04/12] dts: improve path handling for local and remote paths Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 05/12] dts: add the ability to copy directories via remote Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 06/12] dts: add ability to prevent overwriting files/dirs Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 07/12] dts: update argument option for prevent overwriting Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 08/12] dts: add support for externally compiled DPDK Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 09/12] doc: update argument options for external DPDK build Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 10/12] dts: remove git ref option Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 11/12] doc: remove git-ref argument Juraj Linkeš
2024-09-06 13:26 ` [RFC PATCH v1 12/12] dts: improve statistics Juraj Linkeš
