* [PATCH v2 0/2] Add DTS smoke tests
@ 2023-07-10 16:19 jspewock
2023-07-10 16:19 ` [PATCH v2 1/2] dts: add " jspewock
` (2 more replies)
0 siblings, 3 replies; 8+ messages in thread
From: jspewock @ 2023-07-10 16:19 UTC (permalink / raw)
To: Honnappa.Nagarahalli, juraj.linkes, thomas, lijuan.tu,
wathsala.vithanage, probb
Cc: dev, Jeremy Spewock
From: Jeremy Spewock <jspewock@iol.unh.edu>
This version of the series addresses the comments and suggested improvements
made on the previous non-RFC patch.
RFCs for this patch:
* v3: https://mails.dpdk.org/archives/dev/2023-June/269859.html
* v2: https://mails.dpdk.org/archives/dev/2023-May/267915.html
* v1: https://mails.dpdk.org/archives/dev/2023-April/266580.html
Previous patch:
* v1: https://mails.dpdk.org/archives/dev/2023-June/271309.html
Jeremy Spewock (2):
dts: add smoke tests
dts: add paramiko to dependencies
dts/conf.yaml | 17 +-
dts/framework/config/__init__.py | 116 +++++++++--
dts/framework/config/conf_yaml_schema.json | 142 +++++++++++++-
dts/framework/dts.py | 88 ++++++---
dts/framework/exception.py | 12 ++
dts/framework/remote_session/__init__.py | 10 +-
dts/framework/remote_session/os_session.py | 24 ++-
dts/framework/remote_session/posix_session.py | 29 ++-
.../remote_session/remote/__init__.py | 10 +
.../remote/interactive_remote_session.py | 118 ++++++++++++
.../remote/interactive_shell.py | 99 ++++++++++
.../remote_session/remote/testpmd_shell.py | 67 +++++++
dts/framework/test_result.py | 37 +++-
dts/framework/test_suite.py | 21 +-
dts/framework/testbed_model/node.py | 2 +
dts/framework/testbed_model/sut_node.py | 180 +++++++++++++-----
dts/poetry.lock | 160 ++++++++++++----
dts/pyproject.toml | 1 +
dts/tests/TestSuite_smoke_tests.py | 118 ++++++++++++
19 files changed, 1118 insertions(+), 133 deletions(-)
create mode 100644 dts/framework/remote_session/remote/interactive_remote_session.py
create mode 100644 dts/framework/remote_session/remote/interactive_shell.py
create mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
create mode 100644 dts/tests/TestSuite_smoke_tests.py
--
2.41.0
* [PATCH v2 1/2] dts: add smoke tests
2023-07-10 16:19 [PATCH v2 0/2] Add DTS smoke tests jspewock
@ 2023-07-10 16:19 ` jspewock
2023-07-11 9:42 ` Juraj Linkeš
2023-07-10 16:19 ` [PATCH v2 2/2] dts: add paramiko to dependencies jspewock
2023-08-18 18:34 ` [PATCH v2 0/2] Add DTS smoke tests Adam Hassick
2 siblings, 1 reply; 8+ messages in thread
From: jspewock @ 2023-07-10 16:19 UTC (permalink / raw)
To: Honnappa.Nagarahalli, juraj.linkes, thomas, lijuan.tu,
wathsala.vithanage, probb
Cc: dev, Jeremy Spewock
From: Jeremy Spewock <jspewock@iol.unh.edu>
Adds a new test suite for running smoke tests that verify general
configuration aspects of the system under test. If any of these tests
fail, the DTS execution terminates as part of a "fail-fast" model.
Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
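Note for reviewers: below is a minimal sketch of how a test suite is expected
to use the new interactive shell factory added in this patch. The suite class
is illustrative only; create_interactive_shell(), TestPmdShell and
get_devices() are the names introduced by this series.

    from framework.config import InteractiveApp
    from framework.remote_session import TestPmdShell
    from framework.test_suite import TestSuite

    class ExampleShellUsage(TestSuite):
        def test_example(self) -> None:
            # Request an interactive testpmd session; with no explicit path,
            # the default (app/dpdk-testpmd under the remote DPDK build dir)
            # is used and EAL parameters default to an empty string.
            shell = self.sut_node.create_interactive_shell(
                InteractiveApp.testpmd
            )
            # The factory returns a Union, so narrow the type for mypy.
            assert isinstance(shell, TestPmdShell)
            # Parses "show device info all" into a list of device names.
            devices = shell.get_devices()
            self.verify(len(devices) > 0, "testpmd did not list any devices")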
dts/conf.yaml | 17 +-
dts/framework/config/__init__.py | 116 +++++++++--
dts/framework/config/conf_yaml_schema.json | 142 +++++++++++++-
dts/framework/dts.py | 88 ++++++---
dts/framework/exception.py | 12 ++
dts/framework/remote_session/__init__.py | 10 +-
dts/framework/remote_session/os_session.py | 24 ++-
dts/framework/remote_session/posix_session.py | 29 ++-
.../remote_session/remote/__init__.py | 10 +
.../remote/interactive_remote_session.py | 118 ++++++++++++
.../remote/interactive_shell.py | 99 ++++++++++
.../remote_session/remote/testpmd_shell.py | 67 +++++++
dts/framework/test_result.py | 37 +++-
dts/framework/test_suite.py | 21 +-
dts/framework/testbed_model/node.py | 2 +
dts/framework/testbed_model/sut_node.py | 180 +++++++++++++-----
dts/tests/TestSuite_smoke_tests.py | 118 ++++++++++++
17 files changed, 994 insertions(+), 96 deletions(-)
create mode 100644 dts/framework/remote_session/remote/interactive_remote_session.py
create mode 100644 dts/framework/remote_session/remote/interactive_shell.py
create mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
create mode 100644 dts/tests/TestSuite_smoke_tests.py
diff --git a/dts/conf.yaml b/dts/conf.yaml
index a9bd8a3e..2717de13 100644
--- a/dts/conf.yaml
+++ b/dts/conf.yaml
@@ -10,9 +10,13 @@ executions:
compiler_wrapper: ccache
perf: false
func: true
+ skip_smoke_tests: false # optional flag that allows you to skip smoke tests
test_suites:
- hello_world
- system_under_test: "SUT 1"
+ system_under_test:
+ node_name: "SUT 1"
+ vdevs: # optional; if removed, vdevs won't be used in the execution
+ - "crypto_openssl"
nodes:
- name: "SUT 1"
hostname: sut1.change.me.localhost
@@ -20,6 +24,17 @@ nodes:
arch: x86_64
os: linux
lcores: ""
+ ports:
+ - pci: "0000:00:08.0"
+ os_driver_for_dpdk: vfio-pci # OS driver that DPDK will use
+ os_driver: i40e
+ peer_node: "TG 1"
+ peer_pci: "0000:00:08.0"
+ - pci: "0000:00:08.1"
+ os_driver_for_dpdk: vfio-pci
+ os_driver: i40e
+ peer_node: "TG 1"
+ peer_pci: "0000:00:08.1"
use_first_core: false
memory_channels: 4
hugepages: # optional; if removed, will use system hugepage configuration
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ebb0823f..75ac1cbe 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -12,6 +12,7 @@
import pathlib
from dataclasses import dataclass
from enum import Enum, auto, unique
+from pathlib import PurePath
from typing import Any, TypedDict
import warlock # type: ignore
@@ -72,6 +73,20 @@ class HugepageConfiguration:
force_first_numa: bool
+@dataclass(slots=True, frozen=True)
+class PortConfig:
+ node: str
+ pci: str
+ os_driver_for_dpdk: str
+ os_driver: str
+ peer_node: str
+ peer_pci: str
+
+ @staticmethod
+ def from_dict(node: str, d: dict) -> "PortConfig":
+ return PortConfig(node=node, **d)
+
+
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
name: str
@@ -84,6 +99,7 @@ class NodeConfiguration:
use_first_core: bool
memory_channels: int
hugepages: HugepageConfiguration | None
+ ports: list[PortConfig]
@staticmethod
def from_dict(d: dict) -> "NodeConfiguration":
@@ -92,18 +108,43 @@ def from_dict(d: dict) -> "NodeConfiguration":
if "force_first_numa" not in hugepage_config:
hugepage_config["force_first_numa"] = False
hugepage_config = HugepageConfiguration(**hugepage_config)
+ common_config = {
+ "name": d["name"],
+ "hostname": d["hostname"],
+ "user": d["user"],
+ "password": d.get("password"),
+ "arch": Architecture(d["arch"]),
+ "os": OS(d["os"]),
+ "lcores": d.get("lcores", "1"),
+ "use_first_core": d.get("use_first_core", False),
+ "memory_channels": d.get("memory_channels", 1),
+ "hugepages": hugepage_config,
+ "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ }
+
+ return NodeConfiguration(**common_config)
+
+
+@dataclass(slots=True)
+class NodeInfo:
+ """Class to hold important versions within the node.
+
+ This class, unlike the NodeConfiguration class, cannot be generated at the start.
+ This is because we need to initialize a connection with the node before we can
+ collect the information needed in this class. Therefore, it cannot be a part of
+ the configuration class above.
+ """
- return NodeConfiguration(
- name=d["name"],
- hostname=d["hostname"],
- user=d["user"],
- password=d.get("password"),
- arch=Architecture(d["arch"]),
- os=OS(d["os"]),
- lcores=d.get("lcores", "1"),
- use_first_core=d.get("use_first_core", False),
- memory_channels=d.get("memory_channels", 1),
- hugepages=hugepage_config,
+ os_name: str
+ os_version: str
+ kernel_version: str
+
+ @staticmethod
+ def from_dict(d: dict):
+ return NodeInfo(
+ os_name=d["os_name"],
+ os_version=d["os_version"],
+ kernel_version=d["kernel_version"],
)
@@ -128,6 +169,24 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
)
+@dataclass(slots=True)
+class BuildTargetInfo:
+ """Class to hold important versions within the build target.
+
+ This is very similar to the NodeInfo class; it just holds information
+ for the build target instead.
+ """
+
+ dpdk_version: str
+ compiler_version: str
+
+ @staticmethod
+ def from_dict(d: dict):
+ return BuildTargetInfo(
+ dpdk_version=d["dpdk_version"], compiler_version=d["compiler_version"]
+ )
+
+
class TestSuiteConfigDict(TypedDict):
suite: str
cases: list[str]
@@ -157,6 +216,8 @@ class ExecutionConfiguration:
func: bool
test_suites: list[TestSuiteConfig]
system_under_test: NodeConfiguration
+ vdevs: list[str]
+ skip_smoke_tests: bool
@staticmethod
def from_dict(d: dict, node_map: dict) -> "ExecutionConfiguration":
@@ -166,15 +227,20 @@ def from_dict(d: dict, node_map: dict) -> "ExecutionConfiguration":
test_suites: list[TestSuiteConfig] = list(
map(TestSuiteConfig.from_dict, d["test_suites"])
)
- sut_name = d["system_under_test"]
+ sut_name = d["system_under_test"]["node_name"]
+ skip_smoke_tests = d.get("skip_smoke_tests", False)
assert sut_name in node_map, f"Unknown SUT {sut_name} in execution {d}"
-
+ vdevs = (
+ d["system_under_test"]["vdevs"] if "vdevs" in d["system_under_test"] else []
+ )
return ExecutionConfiguration(
build_targets=build_targets,
perf=d["perf"],
func=d["func"],
+ skip_smoke_tests=skip_smoke_tests,
test_suites=test_suites,
system_under_test=node_map[sut_name],
+ vdevs=vdevs,
)
@@ -221,3 +287,27 @@ def load_config() -> Configuration:
CONFIGURATION = load_config()
+
+
+@unique
+class InteractiveApp(Enum):
+ """An enum that represents different supported interactive applications
+
+ The values in this enum must all be set to objects that have a key called
+ "default_path" where "default_path" represents a PurPath object for the path
+ to the application. This default path will be passed into the handler class
+ for the application so that it can start the application. For every key other
+ than the default shell option, the path will be appended to the path to the DPDK
+ build directory for the current SUT node.
+ """
+
+ shell = {"default_path": PurePath()}
+ testpmd = {"default_path": PurePath("app", "dpdk-testpmd")}
+
+ def get_path(self) -> PurePath:
+ """A method for getting the default paths of an application
+
+ Returns:
+ A PurePath object representing an OS-agnostic path to the application.
+ """
+ return self.value["default_path"]
diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
index ca2d4a1e..61f52b43 100644
--- a/dts/framework/config/conf_yaml_schema.json
+++ b/dts/framework/config/conf_yaml_schema.json
@@ -6,6 +6,76 @@
"type": "string",
"description": "A unique identifier for a node"
},
+ "NIC": {
+ "type": "string",
+ "enum": [
+ "ALL",
+ "ConnectX3_MT4103",
+ "ConnectX4_LX_MT4117",
+ "ConnectX4_MT4115",
+ "ConnectX5_MT4119",
+ "ConnectX5_MT4121",
+ "I40E_10G-10G_BASE_T_BC",
+ "I40E_10G-10G_BASE_T_X722",
+ "I40E_10G-SFP_X722",
+ "I40E_10G-SFP_XL710",
+ "I40E_10G-X722_A0",
+ "I40E_1G-1G_BASE_T_X722",
+ "I40E_25G-25G_SFP28",
+ "I40E_40G-QSFP_A",
+ "I40E_40G-QSFP_B",
+ "IAVF-ADAPTIVE_VF",
+ "IAVF-VF",
+ "IAVF_10G-X722_VF",
+ "ICE_100G-E810C_QSFP",
+ "ICE_25G-E810C_SFP",
+ "ICE_25G-E810_XXV_SFP",
+ "IGB-I350_VF",
+ "IGB_1G-82540EM",
+ "IGB_1G-82545EM_COPPER",
+ "IGB_1G-82571EB_COPPER",
+ "IGB_1G-82574L",
+ "IGB_1G-82576",
+ "IGB_1G-82576_QUAD_COPPER",
+ "IGB_1G-82576_QUAD_COPPER_ET2",
+ "IGB_1G-82580_COPPER",
+ "IGB_1G-I210_COPPER",
+ "IGB_1G-I350_COPPER",
+ "IGB_1G-I354_SGMII",
+ "IGB_1G-PCH_LPTLP_I218_LM",
+ "IGB_1G-PCH_LPTLP_I218_V",
+ "IGB_1G-PCH_LPT_I217_LM",
+ "IGB_1G-PCH_LPT_I217_V",
+ "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
+ "IGC-I225_LM",
+ "IGC-I226_LM",
+ "IXGBE_10G-82599_SFP",
+ "IXGBE_10G-82599_SFP_SF_QP",
+ "IXGBE_10G-82599_T3_LOM",
+ "IXGBE_10G-82599_VF",
+ "IXGBE_10G-X540T",
+ "IXGBE_10G-X540_VF",
+ "IXGBE_10G-X550EM_A_SFP",
+ "IXGBE_10G-X550EM_X_10G_T",
+ "IXGBE_10G-X550EM_X_SFP",
+ "IXGBE_10G-X550EM_X_VF",
+ "IXGBE_10G-X550T",
+ "IXGBE_10G-X550_VF",
+ "brcm_57414",
+ "brcm_P2100G",
+ "cavium_0011",
+ "cavium_a034",
+ "cavium_a063",
+ "cavium_a064",
+ "fastlinq_ql41000",
+ "fastlinq_ql41000_vf",
+ "fastlinq_ql45000",
+ "fastlinq_ql45000_vf",
+ "hi1822",
+ "virtio"
+ ]
+ },
+
"ARCH": {
"type": "string",
"enum": [
@@ -94,6 +164,19 @@
"amount"
]
},
+ "pci_address": {
+ "type": "string",
+ "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
+ },
+ "port_peer_address": {
+ "description": "Peer is a TRex port, and IXIA port or a PCI address",
+ "oneOf": [
+ {
+ "description": "PCI peer port",
+ "$ref": "#/definitions/pci_address"
+ }
+ ]
+ },
"test_suite": {
"type": "string",
"enum": [
@@ -165,6 +248,44 @@
},
"hugepages": {
"$ref": "#/definitions/hugepages"
+ },
+ "ports": {
+ "type": "array",
+ "items": {
+ "type": "object",
+ "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are an inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
+ "properties": {
+ "pci": {
+ "$ref": "#/definitions/pci_address",
+ "description": "The local PCI address of the port"
+ },
+ "os_driver_for_dpdk": {
+ "type": "string",
+ "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
+ },
+ "os_driver": {
+ "type": "string",
+ "description": "The driver normally used by this port (ex: i40e)"
+ },
+ "peer_node": {
+ "type": "string",
+ "description": "The name of the node the peer port is on"
+ },
+ "peer_pci": {
+ "$ref": "#/definitions/pci_address",
+ "description": "The PCI address of the peer port"
+ }
+ },
+ "additionalProperties": false,
+ "required": [
+ "pci",
+ "os_driver_for_dpdk",
+ "os_driver",
+ "peer_node",
+ "peer_pci"
+ ]
+ },
+ "minimum": 1
}
},
"additionalProperties": false,
@@ -211,8 +332,27 @@
]
}
},
+ "skip_smoke_tests": {
+ "description": "Optional field that allows you to skip smoke testing",
+ "type": "boolean"
+ },
"system_under_test": {
- "$ref": "#/definitions/node_name"
+ "type":"object",
+ "properties": {
+ "node_name": {
+ "$ref": "#/definitions/node_name"
+ },
+ "vdevs": {
+ "description": "Opentional list of names of vdevs to be used in execution",
+ "type": "array",
+ "items": {
+ "type": "string"
+ }
+ }
+ },
+ "required": [
+ "node_name"
+ ]
}
},
"additionalProperties": false,
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 05022845..1b67938f 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -5,7 +5,13 @@
import sys
-from .config import CONFIGURATION, BuildTargetConfiguration, ExecutionConfiguration
+from .config import (
+ CONFIGURATION,
+ BuildTargetConfiguration,
+ ExecutionConfiguration,
+ TestSuiteConfig,
+)
+from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
@@ -82,7 +88,7 @@ def _run_execution(
running all build targets in the given execution.
"""
dts_logger.info(f"Running execution with SUT '{execution.system_under_test.name}'.")
- execution_result = result.add_execution(sut_node.config)
+ execution_result = result.add_execution(sut_node.config, sut_node.node_info)
try:
sut_node.set_up_execution(execution)
@@ -118,14 +124,15 @@ def _run_build_target(
try:
sut_node.set_up_build_target(build_target)
- result.dpdk_version = sut_node.dpdk_version
+ # result.dpdk_version = sut_node.dpdk_version
+ build_target_result.add_build_target_versions(sut_node.get_build_target_info())
build_target_result.update_setup(Result.PASS)
except Exception as e:
dts_logger.exception("Build target setup failed.")
build_target_result.update_setup(Result.FAIL, e)
else:
- _run_suites(sut_node, execution, build_target_result)
+ _run_all_suites(sut_node, execution, build_target_result)
finally:
try:
@@ -136,7 +143,7 @@ def _run_build_target(
build_target_result.update_teardown(Result.FAIL, e)
-def _run_suites(
+def _run_all_suites(
sut_node: SutNode,
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
@@ -146,27 +153,62 @@ def _run_suites(
with possibly only a subset of test cases.
If no subset is specified, run all test cases.
"""
+ end_build_target = False
+ if not execution.skip_smoke_tests:
+ execution.test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
for test_suite_config in execution.test_suites:
try:
- full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
- test_suite_classes = get_test_suites(full_suite_path)
- suites_str = ", ".join((x.__name__ for x in test_suite_classes))
- dts_logger.debug(
- f"Found test suites '{suites_str}' in '{full_suite_path}'."
+ _run_single_suite(
+ sut_node, execution, build_target_result, test_suite_config
)
- except Exception as e:
- dts_logger.exception("An error occurred when searching for test suites.")
- result.update_setup(Result.ERROR, e)
-
- else:
- for test_suite_class in test_suite_classes:
- test_suite = test_suite_class(
- sut_node,
- test_suite_config.test_cases,
- execution.func,
- build_target_result,
- )
- test_suite.run()
+ except BlockingTestSuiteError as e:
+ dts_logger.exception(
+ f"An error occurred within {test_suite_config.test_suite}. "
+ "Skipping build target..."
+ )
+ result.add_error(e)
+ end_build_target = True
+ # if a blocking test suite failed, bail out of the remaining suite executions
+ if end_build_target:
+ break
+
+
+def _run_single_suite(
+ sut_node: SutNode,
+ execution: ExecutionConfiguration,
+ build_target_result: BuildTargetResult,
+ test_suite_config: TestSuiteConfig,
+) -> None:
+ """Runs a single test suite.
+
+ Args:
+ sut_node: Node to run tests on.
+ execution: Execution the test suite belongs to.
+ build_target_result: Build target result that the test suite's result is added to.
+ test_suite_config: Test suite configuration.
+
+ Raises:
+ BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ """
+ try:
+ full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
+ test_suite_classes = get_test_suites(full_suite_path)
+ suites_str = ", ".join((x.__name__ for x in test_suite_classes))
+ dts_logger.debug(f"Found test suites '{suites_str}' in '{full_suite_path}'.")
+ except Exception as e:
+ dts_logger.exception("An error occurred when searching for test suites.")
+ result.update_setup(Result.ERROR, e)
+
+ else:
+ for test_suite_class in test_suite_classes:
+ test_suite = test_suite_class(
+ sut_node,
+ test_suite_config.test_cases,
+ execution.func,
+ build_target_result,
+ result,
+ )
+ test_suite.run()
def _exit_dts() -> None:
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index ca353d98..dfb12df4 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -25,6 +25,7 @@ class ErrorSeverity(IntEnum):
SSH_ERR = 4
DPDK_BUILD_ERR = 10
TESTCASE_VERIFY_ERR = 20
+ BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
@@ -144,3 +145,14 @@ def __init__(self, value: str):
def __str__(self) -> str:
return repr(self.value)
+
+
+class BlockingTestSuiteError(DTSError):
+ suite_name: str
+ severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+
+ def __init__(self, suite_name: str) -> None:
+ self.suite_name = suite_name
+
+ def __str__(self) -> str:
+ return f"Blocking suite {self.suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index ee221503..4fe32d35 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -1,5 +1,6 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2023 University of New Hampshire
"""
The package provides modules for managing remote connections to a remote host (node),
@@ -17,7 +18,14 @@
from .linux_session import LinuxSession
from .os_session import OSSession
-from .remote import CommandResult, RemoteSession, SSHSession
+from .remote import (
+ CommandResult,
+ InteractiveRemoteSession,
+ InteractiveShell,
+ RemoteSession,
+ SSHSession,
+ TestPmdShell,
+)
def create_session(
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/remote_session/os_session.py
index 4c48ae25..4346ecc4 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/remote_session/os_session.py
@@ -6,13 +6,19 @@
from collections.abc import Iterable
from pathlib import PurePath
-from framework.config import Architecture, NodeConfiguration
+from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
from framework.settings import SETTINGS
from framework.testbed_model import LogicalCore
from framework.utils import EnvVarsDict, MesonArgs
-from .remote import CommandResult, RemoteSession, create_remote_session
+from .remote import (
+ CommandResult,
+ InteractiveRemoteSession,
+ RemoteSession,
+ create_interactive_session,
+ create_remote_session,
+)
class OSSession(ABC):
@@ -26,6 +32,7 @@ class OSSession(ABC):
name: str
_logger: DTSLOG
remote_session: RemoteSession
+ interactive_session: InteractiveRemoteSession
def __init__(
self,
@@ -37,6 +44,7 @@ def __init__(
self.name = name
self._logger = logger
self.remote_session = create_remote_session(node_config, name, logger)
+ self.interactive_session = create_interactive_session(node_config, name, logger)
def close(self, force: bool = False) -> None:
"""
@@ -173,3 +181,15 @@ def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
if needed and mount the hugepages if needed.
If force_first_numa is True, configure hugepages just on the first socket.
"""
+
+ @abstractmethod
+ def get_compiler_version(self, compiler_name: str) -> str:
+ """
+ Get installed version of compiler used for DPDK
+ """
+
+ @abstractmethod
+ def get_node_info(self) -> NodeInfo:
+ """
+ Collect information about the node
+ """
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/remote_session/posix_session.py
index d38062e8..f8ec159f 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/remote_session/posix_session.py
@@ -6,7 +6,7 @@
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
-from framework.config import Architecture
+from framework.config import Architecture, NodeInfo
from framework.exception import DPDKBuildError, RemoteCommandExecutionError
from framework.settings import SETTINGS
from framework.utils import EnvVarsDict, MesonArgs
@@ -219,3 +219,30 @@ def _remove_dpdk_runtime_dirs(
def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
return ""
+
+ def get_compiler_version(self, compiler_name: str) -> str:
+ match compiler_name:
+ case "gcc":
+ return self.send_command(f"{compiler_name} --version", 60).stdout.split(
+ "\n"
+ )[0]
+ case "clang":
+ return self.send_command(f"{compiler_name} --version", 60).stdout.split(
+ "\n"
+ )[0]
+ case "msvc":
+ return self.send_command("cl", 60).stdout
+ case "icc":
+ return self.send_command(f"{compiler_name} -V", 60).stdout
+ case _:
+ raise ValueError(f"Unknown compiler {compiler_name}")
+
+ def get_node_info(self) -> NodeInfo:
+ os_release_info = self.send_command(
+ "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
+ SETTINGS.timeout,
+ ).stdout.split("\n")
+ kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
+ return NodeInfo(
+ os_release_info[0].strip(), os_release_info[1].strip(), kernel_version
+ )
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
index 8a151221..224598a8 100644
--- a/dts/framework/remote_session/remote/__init__.py
+++ b/dts/framework/remote_session/remote/__init__.py
@@ -1,16 +1,26 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 University of New Hampshire
# pylama:ignore=W0611
from framework.config import NodeConfiguration
from framework.logger import DTSLOG
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
from .remote_session import CommandResult, RemoteSession
from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/remote/interactive_remote_session.py
new file mode 100644
index 00000000..e145d35d
--- /dev/null
+++ b/dts/framework/remote_session/remote/interactive_remote_session.py
@@ -0,0 +1,118 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 University of New Hampshire
+
+import socket
+import traceback
+from pathlib import PurePath
+from typing import Union
+
+from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore
+from paramiko.ssh_exception import ( # type: ignore
+ AuthenticationException,
+ BadHostKeyException,
+ NoValidConnectionsError,
+ SSHException,
+)
+
+from framework.config import InteractiveApp, NodeConfiguration
+from framework.exception import SSHConnectionError
+from framework.logger import DTSLOG
+
+from .interactive_shell import InteractiveShell
+from .testpmd_shell import TestPmdShell
+
+
+class InteractiveRemoteSession:
+ hostname: str
+ ip: str
+ port: int
+ username: str
+ password: str
+ _logger: DTSLOG
+ _node_config: NodeConfiguration
+ session: SSHClient
+ _transport: Transport | None
+
+ def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ self._node_config = node_config
+ self._logger = _logger
+ self.hostname = node_config.hostname
+ self.username = node_config.user
+ self.password = node_config.password if node_config.password else ""
+ port = "22"
+ self.ip = node_config.hostname
+ if ":" in node_config.hostname:
+ self.ip, port = node_config.hostname.split(":")
+ self.port = int(port)
+ self._logger.info(
+ f"Initializing interactive connection for {self.username}@{self.hostname}"
+ )
+ self._connect()
+ self._logger.info(
+ f"Interactive connection successful for {self.username}@{self.hostname}"
+ )
+
+ def _connect(self) -> None:
+ client = SSHClient()
+ client.set_missing_host_key_policy(AutoAddPolicy)
+ self.session = client
+ retry_attempts = 10
+ for retry_attempt in range(retry_attempts):
+ try:
+ client.connect(
+ self.ip,
+ username=self.username,
+ port=self.port,
+ password=self.password,
+ timeout=20 if self.port else 10,
+ )
+ except (TypeError, BadHostKeyException, AuthenticationException) as e:
+ self._logger.exception(e)
+ raise SSHConnectionError(self.hostname) from e
+ except (NoValidConnectionsError, socket.error, SSHException) as e:
+ self._logger.debug(traceback.format_exc())
+ self._logger.warning(e)
+ self._logger.info(
+ "Retrying interactive session connection: "
+ f"retry number {retry_attempt +1}"
+ )
+ else:
+ break
+ else:
+ raise SSHConnectionError(self.hostname)
+ # Interactive sessions are used on an "as needed" basis so we have
+ # to set a keepalive
+ self._transport = self.session.get_transport()
+ if self._transport is not None:
+ self._transport.set_keepalive(30)
+
+ def create_interactive_shell(
+ self,
+ shell_type: InteractiveApp,
+ path_to_app: PurePath,
+ eal_parameters: str,
+ timeout: float,
+ ) -> Union[InteractiveShell, TestPmdShell]:
+ """
+ See "create_interactive_shell" in SutNode
+ """
+ match (shell_type):
+ case InteractiveApp.shell:
+ return InteractiveShell(
+ self.session, self._logger, path_to_app, timeout
+ )
+ case InteractiveApp.testpmd:
+ return TestPmdShell(
+ self.session,
+ self._logger,
+ path_to_app,
+ timeout=timeout,
+ eal_flags=eal_parameters,
+ )
+ case _:
+ self._logger.info(
+ f"Unhandled app type {shell_type.name}, defaulting to shell."
+ )
+ return InteractiveShell(
+ self.session, self._logger, path_to_app, timeout
+ )
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/remote/interactive_shell.py
new file mode 100644
index 00000000..4b0735c8
--- /dev/null
+++ b/dts/framework/remote_session/remote/interactive_shell.py
@@ -0,0 +1,99 @@
+from pathlib import PurePath
+
+from paramiko import Channel, SSHClient, channel # type: ignore
+
+from framework.logger import DTSLOG
+from framework.settings import SETTINGS
+
+
+class InteractiveShell:
+
+ _interactive_session: SSHClient
+ _stdin: channel.ChannelStdinFile
+ _stdout: channel.ChannelFile
+ _ssh_channel: Channel
+ _logger: DTSLOG
+ _timeout: float
+ _path_to_app: PurePath
+
+ def __init__(
+ self,
+ interactive_session: SSHClient,
+ logger: DTSLOG,
+ path_to_app: PurePath,
+ timeout: float = SETTINGS.timeout,
+ ) -> None:
+ self._interactive_session = interactive_session
+ self._ssh_channel = self._interactive_session.invoke_shell()
+ self._stdin = self._ssh_channel.makefile_stdin("w")
+ self._stdout = self._ssh_channel.makefile("r")
+ self._ssh_channel.settimeout(timeout)
+ self._ssh_channel.set_combine_stderr(True) # combines stdout and stderr streams
+ self._logger = logger
+ self._timeout = timeout
+ self._path_to_app = path_to_app
+
+
+ def send_command_no_output(self, command: str) -> None:
+ """Send command to channel without recording output.
+
+ This method will not verify any input or output; it will simply assume the
+ command succeeded. This method will also consume all output in the buffer
+ after executing the command.
+ """
+ self._logger.info(
+ f"Sending command {command.strip()} and not collecting output"
+ )
+ self._stdin.write(f"{command}\n")
+ self._stdin.flush()
+ self.empty_stdout_buffer()
+
+ def empty_stdout_buffer(self) -> None:
+ """Removes all data from the stdout buffer.
+
+ Because of the way paramiko handles read buffers, there is no way to effectively
+ remove data from, or "flush", read buffers. This method essentially moves our
+ offset on the buffer to the end and thus "removes" the data from the buffer.
+ Timeouts are raised on read operations of paramiko pipes based on whether data
+ was received before the timeout, so we assume that reaching the timeout means
+ we are at the end of the buffer.
+ """
+ self._ssh_channel.settimeout(0.5)
+ try:
+ for line in self._stdout:
+ pass
+ except TimeoutError:
+ pass
+ self._ssh_channel.settimeout(self._timeout) # reset timeout
+
+ def send_command_get_output(self, command: str, prompt: str) -> str:
+ """Send a command and get all output before the expected ending string.
+
+ Lines that expect input are not included in the stdout buffer so they cannot be
+ used for expect. For example, if you were prompted to log into something
+ with a username and password, you cannot expect "username:" because it won't
+ yet be in the stdout buffer. A workaround for this could be consuming an
+ extra newline character to force the current prompt into the stdout buffer.
+
+ Returns:
+ All output in the buffer before expected string
+ """
+ self._logger.info(f"Sending command {command.strip()}...")
+ self._stdin.write(f"{command}\n")
+ self._stdin.flush()
+ out: str = ""
+ for line in self._stdout:
+ out += line
+ if prompt in line and not line.rstrip().endswith(
+ command.rstrip()
+ ): # ignore line that sent command
+ break
+ self._logger.debug(f"Got output: {out}")
+ return out
+
+ def close(self) -> None:
+ self._stdin.close()
+ self._ssh_channel.close()
+
+ def __del__(self) -> None:
+ self.close()
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/remote/testpmd_shell.py
new file mode 100644
index 00000000..bde3468b
--- /dev/null
+++ b/dts/framework/remote_session/remote/testpmd_shell.py
@@ -0,0 +1,67 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 University of New Hampshire
+
+
+from pathlib import PurePath
+
+from paramiko import SSHClient # type: ignore
+
+from framework.logger import DTSLOG
+from framework.settings import SETTINGS
+
+from .interactive_shell import InteractiveShell
+
+
+class TestPmdShell(InteractiveShell):
+ expected_prompt: str = "testpmd>"
+ _eal_flags: str
+
+ def __init__(
+ self,
+ interactive_session: SSHClient,
+ logger: DTSLOG,
+ path_to_testpmd: PurePath,
+ eal_flags: str,
+ timeout: float = SETTINGS.timeout,
+ ) -> None:
+ """Initializes an interactive testpmd session using specified parameters."""
+ self._eal_flags = eal_flags
+
+ super(TestPmdShell, self).__init__(
+ interactive_session,
+ logger=logger,
+ path_to_app=path_to_testpmd,
+ timeout=timeout,
+ )
+ self._start_application()
+
+ def _start_application(self) -> None:
+ """Starts a new interactive testpmd shell using _path_to_app.
+ """
+ self.send_command(
+ f"{self._path_to_app} {self._eal_flags} -- -i",
+ )
+
+ def send_command(self, command: str, prompt: str = expected_prompt) -> str:
+ """Specific way of handling the command for testpmd
+
+ An extra newline character is consumed in order to force the current line into
+ the stdout buffer.
+ """
+ return self.send_command_get_output(f"{command}\n", prompt)
+
+ def get_devices(self) -> list[str]:
+ """Get a list of device names that are known to testpmd
+
+ Uses the device info listed in testpmd and then parses the output to
+ return only the names of the devices.
+
+ Returns:
+ A list of strings representing device names (e.g. 0000:14:00.1)
+ """
+ dev_info: str = self.send_command("show device info all")
+ dev_list: list[str] = []
+ for line in dev_info.split("\n"):
+ if "device name:" in line.lower():
+ dev_list.append(line.strip().split(": ")[1].strip())
+ return dev_list
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 74391982..2436eb7f 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -1,5 +1,6 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 University of New Hampshire
"""
Generic result container and reporters
@@ -13,9 +14,11 @@
OS,
Architecture,
BuildTargetConfiguration,
+ BuildTargetInfo,
Compiler,
CPUType,
NodeConfiguration,
+ NodeInfo,
)
from .exception import DTSError, ErrorSeverity
from .logger import DTSLOG
@@ -67,12 +70,14 @@ class Statistics(dict):
Using a dict provides a convenient way to format the data.
"""
- def __init__(self, dpdk_version):
+ def __init__(self, output_info: dict[str, str] | None):
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
self["PASS RATE"] = 0.0
- self["DPDK VERSION"] = dpdk_version
+ if output_info:
+ for info_key, info_val in output_info.items():
+ self[info_key] = info_val
def __iadd__(self, other: Result) -> "Statistics":
"""
@@ -206,6 +211,8 @@ class BuildTargetResult(BaseResult):
os: OS
cpu: CPUType
compiler: Compiler
+ compiler_version: str | None
+ dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
super(BuildTargetResult, self).__init__()
@@ -213,6 +220,12 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.os = build_target.os
self.cpu = build_target.cpu
self.compiler = build_target.compiler
+ self.compiler_version = None
+ self.dpdk_version = None
+
+ def add_build_target_versions(self, versions: BuildTargetInfo) -> None:
+ self.compiler_version = versions.compiler_version
+ self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
test_suite_result = TestSuiteResult(test_suite_name)
@@ -228,10 +241,17 @@ class ExecutionResult(BaseResult):
"""
sut_node: NodeConfiguration
+ sut_os_name: str
+ sut_os_version: str
+ sut_kernel_version: str
- def __init__(self, sut_node: NodeConfiguration):
+ def __init__(self, sut_node: NodeConfiguration, sut_version_info: NodeInfo):
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
+ self.sut_version_info = sut_version_info
+ self.sut_os_name = sut_version_info.os_name
+ self.sut_os_version = sut_version_info.os_version
+ self.sut_kernel_version = sut_version_info.kernel_version
def add_build_target(
self, build_target: BuildTargetConfiguration
@@ -258,6 +278,7 @@ class DTSResult(BaseResult):
"""
dpdk_version: str | None
+ output: dict | None
_logger: DTSLOG
_errors: list[Exception]
_return_code: ErrorSeverity
@@ -267,14 +288,17 @@ class DTSResult(BaseResult):
def __init__(self, logger: DTSLOG):
super(DTSResult, self).__init__()
self.dpdk_version = None
+ self.output = None
self._logger = logger
self._errors = []
self._return_code = ErrorSeverity.NO_ERR
self._stats_result = None
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
- def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
- execution_result = ExecutionResult(sut_node)
+ def add_execution(
+ self, sut_node: NodeConfiguration, sut_version_info: NodeInfo
+ ) -> ExecutionResult:
+ execution_result = ExecutionResult(sut_node, sut_version_info)
self._inner_results.append(execution_result)
return execution_result
@@ -296,7 +320,8 @@ def process(self) -> None:
for error in self._errors:
self._logger.debug(repr(error))
- self._stats_result = Statistics(self.dpdk_version)
+ self._stats_result = Statistics(self.output)
+ # add information gathered from the smoke tests to the statistics
self.add_stats(self._stats_result)
with open(self._stats_filename, "w+") as stats_file:
stats_file.write(str(self._stats_result))
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 0705f38f..5df5d2a6 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,10 +11,21 @@
import re
from types import MethodType
-from .exception import ConfigurationError, SSHTimeoutError, TestCaseVerifyError
+from .exception import (
+ BlockingTestSuiteError,
+ ConfigurationError,
+ SSHTimeoutError,
+ TestCaseVerifyError,
+)
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
-from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
+from .test_result import (
+ BuildTargetResult,
+ DTSResult,
+ Result,
+ TestCaseResult,
+ TestSuiteResult,
+)
from .testbed_model import SutNode
@@ -37,10 +48,12 @@ class TestSuite(object):
"""
sut_node: SutNode
+ is_blocking = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
_result: TestSuiteResult
+ _dts_result: DTSResult
def __init__(
self,
@@ -48,6 +61,7 @@ def __init__(
test_cases: list[str],
func: bool,
build_target_result: BuildTargetResult,
+ dts_result: DTSResult,
):
self.sut_node = sut_node
self._logger = getLogger(self.__class__.__name__)
@@ -55,6 +69,7 @@ def __init__(
self._test_cases_to_run.extend(SETTINGS.test_cases)
self._func = func
self._result = build_target_result.add_test_suite(self.__class__.__name__)
+ self._dts_result = dts_result
def set_up_suite(self) -> None:
"""
@@ -118,6 +133,8 @@ def run(self) -> None:
f"the next test suite may be affected."
)
self._result.update_setup(Result.ERROR, e)
+ if len(self._result.get_errors()) > 0 and self.is_blocking:
+ raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
"""
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index d48fafe6..c5147e0e 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -40,6 +40,7 @@ class Node(object):
lcores: list[LogicalCore]
_logger: DTSLOG
_other_sessions: list[OSSession]
+ _execution_config: ExecutionConfiguration
def __init__(self, node_config: NodeConfiguration):
self.config = node_config
@@ -64,6 +65,7 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
+ self._execution_config = execution_config
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
"""
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2b2b50d9..9b17ac3d 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -1,14 +1,27 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 University of New Hampshire
import os
import tarfile
import time
from pathlib import PurePath
-
-from framework.config import BuildTargetConfiguration, NodeConfiguration
-from framework.remote_session import CommandResult, OSSession
+from typing import Union
+
+from framework.config import (
+ BuildTargetConfiguration,
+ BuildTargetInfo,
+ InteractiveApp,
+ NodeConfiguration,
+ NodeInfo,
+)
+from framework.remote_session import (
+ CommandResult,
+ InteractiveShell,
+ OSSession,
+ TestPmdShell,
+)
from framework.settings import SETTINGS
from framework.utils import EnvVarsDict, MesonArgs
@@ -16,6 +29,52 @@
from .node import Node
+class EalParameters(object):
+ def __init__(
+ self,
+ lcore_list: LogicalCoreList,
+ memory_channels: int,
+ prefix: str,
+ no_pci: bool,
+ vdevs: list[VirtualDevice],
+ other_eal_param: str,
+ ):
+ """
+ Generate eal parameters character string;
+ :param lcore_list: the list of logical cores to use.
+ :param memory_channels: the number of memory channels to use.
+ :param prefix: set file prefix string, eg:
+ prefix='vf'
+ :param no_pci: switch of disable PCI bus eg:
+ no_pci=True
+ :param vdevs: virtual device list, eg:
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ :param other_eal_param: user defined DPDK eal parameters, eg:
+ other_eal_param='--single-file-segments'
+ """
+ self._lcore_list = f"-l {lcore_list}"
+ self._memory_channels = f"-n {memory_channels}"
+ self._prefix = prefix
+ if prefix:
+ self._prefix = f"--file-prefix={prefix}"
+ self._no_pci = "--no-pci" if no_pci else ""
+ self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
+ self._other_eal_param = other_eal_param
+
+ def __str__(self) -> str:
+ return (
+ f"{self._lcore_list} "
+ f"{self._memory_channels} "
+ f"{self._prefix} "
+ f"{self._no_pci} "
+ f"{self._vdevs} "
+ f"{self._other_eal_param}"
+ )
+
+
class SutNode(Node):
"""
A class for managing connections to the System under Test, providing
@@ -30,9 +89,11 @@ class SutNode(Node):
_env_vars: EnvVarsDict
_remote_tmp_dir: PurePath
__remote_dpdk_dir: PurePath | None
- _dpdk_version: str | None
_app_compile_timeout: float
_dpdk_kill_session: OSSession | None
+ _dpdk_version: str | None
+ _node_info: NodeInfo | None
+ _compiler_version: str | None
def __init__(self, node_config: NodeConfiguration):
super(SutNode, self).__init__(node_config)
@@ -41,12 +102,14 @@ def __init__(self, node_config: NodeConfiguration):
self._env_vars = EnvVarsDict()
self._remote_tmp_dir = self.main_session.get_remote_tmp_dir()
self.__remote_dpdk_dir = None
- self._dpdk_version = None
self._app_compile_timeout = 90
self._dpdk_kill_session = None
self._dpdk_timestamp = (
f"{str(os.getpid())}_{time.strftime('%Y%m%d%H%M%S', time.localtime())}"
)
+ self._dpdk_version = None
+ self._node_info = None
+ self._compiler_version = None
@property
def _remote_dpdk_dir(self) -> PurePath:
@@ -75,6 +138,32 @@ def dpdk_version(self) -> str:
)
return self._dpdk_version
+ @property
+ def node_info(self) -> NodeInfo:
+ if self._node_info is None:
+ self._node_info = self.main_session.get_node_info()
+ return self._node_info
+
+ @property
+ def compiler_version(self) -> str:
+ if self._compiler_version is None:
+ if self._build_target_config is not None:
+ self._compiler_version = self.main_session.get_compiler_version(
+ self._build_target_config.compiler.name
+ )
+ else:
+ self._logger.warning(
+ "Failed to get compiler version because"
+ "_build_target_config is None."
+ )
+ return ""
+ return self._compiler_version
+
+ def get_build_target_info(self) -> BuildTargetInfo:
+ return BuildTargetInfo(
+ dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
+ )
+
def _guess_dpdk_remote_dir(self) -> PurePath:
return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
@@ -84,6 +173,10 @@ def _set_up_build_target(
"""
Setup DPDK on the SUT node.
"""
+ # we want to ensure that dpdk_version and compiler_version are reset for new
+ # build targets
+ self._dpdk_version = None
+ self._compiler_version = None
self._configure_build_target(build_target_config)
self._copy_dpdk_tarball()
self._build_dpdk()
@@ -262,48 +355,43 @@ def run_dpdk_app(
f"{app_path} {eal_args}", timeout, verify=True
)
-
-class EalParameters(object):
- def __init__(
+ def create_interactive_shell(
self,
- lcore_list: LogicalCoreList,
- memory_channels: int,
- prefix: str,
- no_pci: bool,
- vdevs: list[VirtualDevice],
- other_eal_param: str,
- ):
+ shell_type: InteractiveApp,
+ path_to_app: PurePath | None = None,
+ timeout: float = SETTINGS.timeout,
+ eal_parameters: EalParameters | None = None,
+ ) -> Union[InteractiveShell, TestPmdShell]:
+ """Create a handler for an interactive session.
+
+ This method is a factory that calls a method in OSSession to create shells for
+ different DPDK applications.
+
+ Args:
+ shell_type: Enum value representing the desired application.
+ path_to_app: Represents a path to the application you are attempting to
+ launch. This path will be executed at the start of the app
+ initialization. If one isn't provided, the default specified in the
+ enumeration will be used.
+ timeout: Timeout for reading output from the SSH channel. If you are
+ reading from the buffer and don't receive any data within the timeout
+ it will throw an error.
+ eal_parameters: EAL parameters to use to launch the app. This is
+ ignored for the base "shell" type.
+ Returns:
+ Instance of the desired interactive application.
"""
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- """
- self._lcore_list = f"-l {lcore_list}"
- self._memory_channels = f"-n {memory_channels}"
- self._prefix = prefix
- if prefix:
- self._prefix = f"--file-prefix={prefix}"
- self._no_pci = "--no-pci" if no_pci else ""
- self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
- self._other_eal_param = other_eal_param
+ # if we just want a default shell, there is no need to append the DPDK build
+ # directory to the path
+ default_path = shell_type.get_path()
- def __str__(self) -> str:
- return (
- f"{self._lcore_list} "
- f"{self._memory_channels} "
- f"{self._prefix} "
- f"{self._no_pci} "
- f"{self._vdevs} "
- f"{self._other_eal_param}"
+ if shell_type != InteractiveApp.shell:
+ default_path = self.main_session.join_remote_path(
+ self.remote_dpdk_build_dir, shell_type.get_path()
+ )
+ return self.main_session.interactive_session.create_interactive_shell(
+ shell_type,
+ path_to_app if path_to_app else default_path,
+ str(eal_parameters) if eal_parameters else "",
+ timeout,
)
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
new file mode 100644
index 00000000..b7e70ee1
--- /dev/null
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -0,0 +1,118 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 University of New Hampshire
+
+import re
+
+from framework.config import InteractiveApp
+from framework.remote_session import TestPmdShell
+from framework.settings import SETTINGS
+from framework.test_suite import TestSuite
+
+
+class SmokeTests(TestSuite):
+ is_blocking = True
+ # dicts in this list are expected to have two keys:
+ # "pci_address" and "current_driver"
+ nics_in_node: list[dict[str, str]] = []
+
+ def set_up_suite(self) -> None:
+ """
+ Setup:
+ Set the build directory path and generate a list of NICs in the SUT node.
+ """
+ self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
+ for nic in self.sut_node.config.ports:
+ new_dict = {
+ "pci_address": nic.pci,
+ "current_driver": nic.os_driver.strip(),
+ }
+ self.nics_in_node.append(new_dict)
+
+ def test_unit_tests(self) -> None:
+ """
+ Test:
+ Run the fast-test unit-test suite through meson.
+ """
+ self.sut_node.main_session.send_command(
+ f"meson test -C {self.dpdk_build_dir_path} --suite fast-tests",
+ 300,
+ verify=True,
+ )
+
+ def test_driver_tests(self) -> None:
+ """
+ Test:
+ Run the driver-test unit-test suite through meson.
+ """
+ list_of_vdevs = ""
+ for dev in self.sut_node._execution_config.vdevs:
+ list_of_vdevs += f"--vdev {dev} "
+ list_of_vdevs = list_of_vdevs[:-1]
+ if list_of_vdevs:
+ self._logger.info(
+ "Running driver tests with the following virtual "
+ f"devices: {list_of_vdevs}"
+ )
+ self.sut_node.main_session.send_command(
+ f"meson test -C {self.dpdk_build_dir_path} --suite driver-tests "
+ f'--test-args "{list_of_vdevs}"',
+ 300,
+ verify=True,
+ )
+ else:
+ self.sut_node.main_session.send_command(
+ f"meson test -C {self.dpdk_build_dir_path} --suite driver-tests",
+ 300,
+ verify=True,
+ )
+
+ def test_devices_listed_in_testpmd(self) -> None:
+ """
+ Test:
+ Uses the testpmd driver to verify that devices have been found by testpmd.
+ """
+ testpmd_driver = self.sut_node.create_interactive_shell(InteractiveApp.testpmd)
+ # We know it should always be a TestPmdShell but mypy doesn't
+ assert isinstance(testpmd_driver, TestPmdShell)
+ dev_list: list[str] = testpmd_driver.get_devices()
+ for nic in self.nics_in_node:
+ self.verify(
+ nic["pci_address"] in dev_list,
+ f"Device {nic['pci_address']} was not listed in testpmd's available devices, "
+ "please check your configuration",
+ )
+
+ def test_device_bound_to_driver(self) -> None:
+ """
+ Test:
+ Ensure that all devices listed in the config are bound to the correct driver.
+ """
+ path_to_devbind = self.sut_node.main_session.join_remote_path(
+ self.sut_node._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
+ )
+
+ regex_for_pci_address = "/[0-9]{4}:[0-9]{2}:[0-9]{2}.[0-9]{1}/"
+ all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
+ f"{path_to_devbind} --status | awk '{regex_for_pci_address}'",
+ SETTINGS.timeout,
+ ).stdout
+
+ for nic in self.nics_in_node:
+ # This regular expression finds the line in the above string that starts
+ # with the address for the nic we are on in the loop and then captures the
+ # name of the driver in a group
+ devbind_info_for_nic = re.search(
+ f"{nic['pci_address']}[^\\n]*drv=([\\d\\w]*) [^\\n]*",
+ all_nics_in_dpdk_devbind,
+ )
+ self.verify(
+ devbind_info_for_nic is not None,
+ f"Failed to find configured device ({nic['pci_address']}) using dpdk-devbind.py",
+ )
+ # We know this isn't None, but mypy doesn't
+ assert devbind_info_for_nic is not None
+ self.verify(
+ devbind_info_for_nic.group(1) == nic["current_driver"],
+ f"Driver for device {nic['pci_address']} does not match driver listed in "
+ f"configuration (bound to {devbind_info_for_nic.group(1)})",
+ )
--
2.41.0
* [PATCH v2 2/2] dts: add paramiko to dependencies
2023-07-10 16:19 [PATCH v2 0/2] Add DTS smoke tests jspewock
2023-07-10 16:19 ` [PATCH v2 1/2] dts: add " jspewock
@ 2023-07-10 16:19 ` jspewock
2023-07-10 22:09 ` Patrick Robb
2023-08-18 19:44 ` Adam Hassick
2023-08-18 18:34 ` [PATCH v2 0/2] Add DTS smoke tests Adam Hassick
2 siblings, 2 replies; 8+ messages in thread
From: jspewock @ 2023-07-10 16:19 UTC (permalink / raw)
To: Honnappa.Nagarahalli, juraj.linkes, thomas, lijuan.tu,
wathsala.vithanage, probb
Cc: dev, Jeremy Spewock
From: Jeremy Spewock <jspewock@iol.unh.edu>
Add paramiko to the dependency files so the framework can create the
interactive SSH sessions introduced in the previous patch.
Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
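Note for reviewers: paramiko is needed because the InteractiveRemoteSession
and InteractiveShell classes added in patch 1/2 keep a persistent SSH channel
open. A rough sketch of the paramiko calls they rely on is below; the
hostname and credentials are placeholders, not real defaults.

    from paramiko import AutoAddPolicy, SSHClient

    client = SSHClient()
    client.set_missing_host_key_policy(AutoAddPolicy)
    # Placeholder host/credentials; DTS takes these from the node config.
    client.connect("sut1.change.me.localhost", username="root", password="")

    # One invoke_shell() channel backs the whole interactive session;
    # commands are written to stdin and output is read back until an
    # expected prompt appears.
    channel = client.invoke_shell()
    stdin = channel.makefile_stdin("w")
    stdout = channel.makefile("r")
    channel.settimeout(15)

    stdin.write("echo hello\n")
    stdin.flush()
    print(stdout.readline())

    client.close()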
dts/poetry.lock | 160 ++++++++++++++++++++++++++++++++++-----------
dts/pyproject.toml | 1 +
2 files changed, 124 insertions(+), 37 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 0b2a007d..dfd9a240 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,20 +1,33 @@
[[package]]
name = "attrs"
-version = "22.1.0"
+version = "23.1.0"
description = "Classes Without Boilerplate"
category = "main"
optional = false
-python-versions = ">=3.5"
+python-versions = ">=3.7"
[package.extras]
-dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"]
-docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
-tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "cloudpickle"]
+cov = ["attrs", "coverage[toml] (>=5.3)"]
+dev = ["attrs", "pre-commit"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
+tests = ["attrs", "zope-interface"]
+tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest-mypy-plugins", "pytest-xdist", "pytest (>=4.3.0)"]
+
+[[package]]
+name = "bcrypt"
+version = "4.0.1"
+description = "Modern password hashing for your software and your servers"
+category = "main"
+optional = false
+python-versions = ">=3.6"
+
+[package.extras]
+tests = ["pytest (>=3.2.1,!=3.3.0)"]
+typecheck = ["mypy"]
[[package]]
name = "black"
-version = "22.10.0"
+version = "22.12.0"
description = "The uncompromising code formatter."
category = "dev"
optional = false
@@ -33,6 +46,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "cffi"
+version = "1.15.1"
+description = "Foreign Function Interface for Python calling C code."
+category = "main"
+optional = false
+python-versions = "*"
+
+[package.dependencies]
+pycparser = "*"
+
[[package]]
name = "click"
version = "8.1.3"
@@ -52,18 +76,39 @@ category = "dev"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+[[package]]
+name = "cryptography"
+version = "41.0.1"
+description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
+category = "main"
+optional = false
+python-versions = ">=3.7"
+
+[package.dependencies]
+cffi = ">=1.12"
+
+[package.extras]
+docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
+docstest = ["pyenchant (>=1.6.11)", "twine (>=1.12.0)", "sphinxcontrib-spelling (>=4.0.1)"]
+nox = ["nox"]
+pep8test = ["black", "ruff", "mypy", "check-sdist"]
+sdist = ["build"]
+ssh = ["bcrypt (>=3.1.5)"]
+test = ["pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist", "pretend"]
+test-randomorder = ["pytest-randomly"]
+
[[package]]
name = "isort"
-version = "5.10.1"
+version = "5.12.0"
description = "A Python utility / library to sort Python imports."
category = "dev"
optional = false
-python-versions = ">=3.6.1,<4.0"
+python-versions = ">=3.8.0"
[package.extras]
-pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
-requirements_deprecated_finder = ["pipreqs", "pip-api"]
-colors = ["colorama (>=0.4.3,<0.5.0)"]
+colors = ["colorama (>=0.4.3)"]
+requirements-deprecated-finder = ["pip-api", "pipreqs"]
+pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"]
plugins = ["setuptools"]
[[package]]
@@ -87,7 +132,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
[[package]]
name = "jsonschema"
-version = "4.17.0"
+version = "4.17.3"
description = "An implementation of JSON Schema validation for Python"
category = "main"
optional = false
@@ -129,15 +174,33 @@ reports = ["lxml"]
[[package]]
name = "mypy-extensions"
-version = "0.4.3"
-description = "Experimental type system extensions for programs checked with the mypy typechecker."
+version = "1.0.0"
+description = "Type system extensions for programs checked with the mypy type checker."
category = "dev"
optional = false
-python-versions = "*"
+python-versions = ">=3.5"
+
+[[package]]
+name = "paramiko"
+version = "3.2.0"
+description = "SSH2 protocol library"
+category = "main"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+bcrypt = ">=3.2"
+cryptography = ">=3.3"
+pynacl = ">=1.5"
+
+[package.extras]
+all = ["pyasn1 (>=0.1.7)", "invoke (>=2.0)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"]
+gssapi = ["pyasn1 (>=0.1.7)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"]
+invoke = ["invoke (>=2.0)"]
[[package]]
name = "pathspec"
-version = "0.10.1"
+version = "0.11.1"
description = "Utility library for gitignore style pattern matching of file paths."
category = "dev"
optional = false
@@ -156,15 +219,15 @@ ptyprocess = ">=0.5"
[[package]]
name = "platformdirs"
-version = "2.5.2"
-description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
+version = "3.5.1"
+description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
category = "dev"
optional = false
python-versions = ">=3.7"
[package.extras]
-docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)", "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
-test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)", "pytest (>=6)"]
+docs = ["furo (>=2023.3.27)", "proselint (>=0.13)", "sphinx-autodoc-typehints (>=1.23,!=1.23.4)", "sphinx (>=6.2.1)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest-cov (>=4)", "pytest-mock (>=3.10)", "pytest (>=7.3.1)"]
[[package]]
name = "ptyprocess"
@@ -176,25 +239,33 @@ python-versions = "*"
[[package]]
name = "pycodestyle"
-version = "2.9.1"
+version = "2.10.0"
description = "Python style guide checker"
category = "dev"
optional = false
python-versions = ">=3.6"
+[[package]]
+name = "pycparser"
+version = "2.21"
+description = "C parser in Python"
+category = "main"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+
[[package]]
name = "pydocstyle"
-version = "6.1.1"
+version = "6.3.0"
description = "Python docstring style checker"
category = "dev"
optional = false
python-versions = ">=3.6"
[package.dependencies]
-snowballstemmer = "*"
+snowballstemmer = ">=2.2.0"
[package.extras]
-toml = ["toml"]
+toml = ["tomli (>=1.2.3)"]
[[package]]
name = "pyflakes"
@@ -228,9 +299,24 @@ tests = ["pytest (>=7.1.2)", "pytest-mypy", "eradicate (>=2.0.0)", "radon (>=5.1
toml = ["toml (>=0.10.2)"]
vulture = ["vulture"]
+[[package]]
+name = "pynacl"
+version = "1.5.0"
+description = "Python binding to the Networking and Cryptography (NaCl) library"
+category = "main"
+optional = false
+python-versions = ">=3.6"
+
+[package.dependencies]
+cffi = ">=1.4.1"
+
+[package.extras]
+docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
+tests = ["pytest (>=3.2.1,!=3.3.0)", "hypothesis (>=3.27.0)"]
+
[[package]]
name = "pyrsistent"
-version = "0.19.1"
+version = "0.19.3"
description = "Persistent/Functional/Immutable data structures"
category = "main"
optional = false
@@ -270,7 +356,7 @@ python-versions = ">=3.7"
[[package]]
name = "types-pyyaml"
-version = "6.0.12.1"
+version = "6.0.12.10"
description = "Typing stubs for PyYAML"
category = "main"
optional = false
@@ -278,7 +364,7 @@ python-versions = "*"
[[package]]
name = "typing-extensions"
-version = "4.4.0"
+version = "4.6.2"
description = "Backported and Experimental Type Hints for Python 3.7+"
category = "dev"
optional = false
@@ -299,13 +385,16 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "1.1"
python-versions = "^3.10"
-content-hash = "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
+content-hash = "c119901b1c13b14adf9b53624430aff8720d9a3a180b028579b82c65d49474df"
[metadata.files]
attrs = []
+bcrypt = []
black = []
+cffi = []
click = []
colorama = []
+cryptography = []
isort = []
jsonpatch = []
jsonpointer = []
@@ -313,20 +402,17 @@ jsonschema = []
mccabe = []
mypy = []
mypy-extensions = []
+paramiko = []
pathspec = []
-pexpect = [
- {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
- {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-platformdirs = [
- {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
- {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
-]
+pexpect = []
+platformdirs = []
ptyprocess = []
pycodestyle = []
+pycparser = []
pydocstyle = []
pyflakes = []
pylama = []
+pynacl = []
pyrsistent = []
pyyaml = []
snowballstemmer = []
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index a136c91e..928837cb 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -13,6 +13,7 @@ pexpect = "^4.8.0"
warlock = "^2.0.1"
PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
+paramiko = "^3.1.0"
[tool.poetry.dev-dependencies]
mypy = "^0.961"
--
2.41.0
^ permalink raw reply [flat|nested] 8+ messages in thread
* Re: [PATCH v2 2/2] dts: add paramiko to dependencies
2023-07-10 16:19 ` [PATCH v2 2/2] dts: add paramiko to dependencies jspewock
@ 2023-07-10 22:09 ` Patrick Robb
2023-08-18 19:44 ` Adam Hassick
1 sibling, 0 replies; 8+ messages in thread
From: Patrick Robb @ 2023-07-10 22:09 UTC (permalink / raw)
To: jspewock
Cc: Honnappa.Nagarahalli, juraj.linkes, thomas, lijuan.tu,
wathsala.vithanage, dev
Tested-by: Patrick Robb <probb@iol.unh.edu>
On Mon, Jul 10, 2023 at 12:23 PM <jspewock@iol.unh.edu> wrote:
> From: Jeremy Spewock <jspewock@iol.unh.edu>
>
> added paramiko to the dependency files
>
> Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
> ---
> dts/poetry.lock | 160 ++++++++++++++++++++++++++++++++++-----------
> dts/pyproject.toml | 1 +
> 2 files changed, 124 insertions(+), 37 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 0b2a007d..dfd9a240 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,20 +1,33 @@
> [[package]]
> name = "attrs"
> -version = "22.1.0"
> +version = "23.1.0"
> description = "Classes Without Boilerplate"
> category = "main"
> optional = false
> -python-versions = ">=3.5"
> +python-versions = ">=3.7"
>
> [package.extras]
> -dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest
> (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit",
> "cloudpickle"]
> -docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
> -tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest
> (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "zope.interface", "cloudpickle"]
> -tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler",
> "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "cloudpickle"]
> +cov = ["attrs", "coverage[toml] (>=5.3)"]
> +dev = ["attrs", "pre-commit"]
> +docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page",
> "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
> +tests = ["attrs", "zope-interface"]
> +tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)",
> "pympler", "pytest-mypy-plugins", "pytest-xdist", "pytest (>=4.3.0)"]
> +
> +[[package]]
> +name = "bcrypt"
> +version = "4.0.1"
> +description = "Modern password hashing for your software and your servers"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.extras]
> +tests = ["pytest (>=3.2.1,!=3.3.0)"]
> +typecheck = ["mypy"]
>
> [[package]]
> name = "black"
> -version = "22.10.0"
> +version = "22.12.0"
> description = "The uncompromising code formatter."
> category = "dev"
> optional = false
> @@ -33,6 +46,17 @@ d = ["aiohttp (>=3.7.4)"]
> jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
> uvloop = ["uvloop (>=0.15.2)"]
>
> +[[package]]
> +name = "cffi"
> +version = "1.15.1"
> +description = "Foreign Function Interface for Python calling C code."
> +category = "main"
> +optional = false
> +python-versions = "*"
> +
> +[package.dependencies]
> +pycparser = "*"
> +
> [[package]]
> name = "click"
> version = "8.1.3"
> @@ -52,18 +76,39 @@ category = "dev"
> optional = false
> python-versions =
> "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
>
> +[[package]]
> +name = "cryptography"
> +version = "41.0.1"
> +description = "cryptography is a package which provides cryptographic
> recipes and primitives to Python developers."
> +category = "main"
> +optional = false
> +python-versions = ">=3.7"
> +
> +[package.dependencies]
> +cffi = ">=1.12"
> +
> +[package.extras]
> +docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
> +docstest = ["pyenchant (>=1.6.11)", "twine (>=1.12.0)",
> "sphinxcontrib-spelling (>=4.0.1)"]
> +nox = ["nox"]
> +pep8test = ["black", "ruff", "mypy", "check-sdist"]
> +sdist = ["build"]
> +ssh = ["bcrypt (>=3.1.5)"]
> +test = ["pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov",
> "pytest-xdist", "pretend"]
> +test-randomorder = ["pytest-randomly"]
> +
> [[package]]
> name = "isort"
> -version = "5.10.1"
> +version = "5.12.0"
> description = "A Python utility / library to sort Python imports."
> category = "dev"
> optional = false
> -python-versions = ">=3.6.1,<4.0"
> +python-versions = ">=3.8.0"
>
> [package.extras]
> -pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
> -requirements_deprecated_finder = ["pipreqs", "pip-api"]
> -colors = ["colorama (>=0.4.3,<0.5.0)"]
> +colors = ["colorama (>=0.4.3)"]
> +requirements-deprecated-finder = ["pip-api", "pipreqs"]
> +pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs",
> "requirementslib"]
> plugins = ["setuptools"]
>
> [[package]]
> @@ -87,7 +132,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*,
> !=3.3.*"
>
> [[package]]
> name = "jsonschema"
> -version = "4.17.0"
> +version = "4.17.3"
> description = "An implementation of JSON Schema validation for Python"
> category = "main"
> optional = false
> @@ -129,15 +174,33 @@ reports = ["lxml"]
>
> [[package]]
> name = "mypy-extensions"
> -version = "0.4.3"
> -description = "Experimental type system extensions for programs checked
> with the mypy typechecker."
> +version = "1.0.0"
> +description = "Type system extensions for programs checked with the mypy
> type checker."
> category = "dev"
> optional = false
> -python-versions = "*"
> +python-versions = ">=3.5"
> +
> +[[package]]
> +name = "paramiko"
> +version = "3.2.0"
> +description = "SSH2 protocol library"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.dependencies]
> +bcrypt = ">=3.2"
> +cryptography = ">=3.3"
> +pynacl = ">=1.5"
> +
> +[package.extras]
> +all = ["pyasn1 (>=0.1.7)", "invoke (>=2.0)", "gssapi (>=1.4.1)", "pywin32
> (>=2.1.8)"]
> +gssapi = ["pyasn1 (>=0.1.7)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"]
> +invoke = ["invoke (>=2.0)"]
>
> [[package]]
> name = "pathspec"
> -version = "0.10.1"
> +version = "0.11.1"
> description = "Utility library for gitignore style pattern matching of
> file paths."
> category = "dev"
> optional = false
> @@ -156,15 +219,15 @@ ptyprocess = ">=0.5"
>
> [[package]]
> name = "platformdirs"
> -version = "2.5.2"
> -description = "A small Python module for determining appropriate
> platform-specific dirs, e.g. a \"user data dir\"."
> +version = "3.5.1"
> +description = "A small Python package for determining appropriate
> platform-specific dirs, e.g. a \"user data dir\"."
> category = "dev"
> optional = false
> python-versions = ">=3.7"
>
> [package.extras]
> -docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)",
> "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
> -test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)",
> "pytest (>=6)"]
> +docs = ["furo (>=2023.3.27)", "proselint (>=0.13)",
> "sphinx-autodoc-typehints (>=1.23,!=1.23.4)", "sphinx (>=6.2.1)"]
> +test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest-cov (>=4)",
> "pytest-mock (>=3.10)", "pytest (>=7.3.1)"]
>
> [[package]]
> name = "ptyprocess"
> @@ -176,25 +239,33 @@ python-versions = "*"
>
> [[package]]
> name = "pycodestyle"
> -version = "2.9.1"
> +version = "2.10.0"
> description = "Python style guide checker"
> category = "dev"
> optional = false
> python-versions = ">=3.6"
>
> +[[package]]
> +name = "pycparser"
> +version = "2.21"
> +description = "C parser in Python"
> +category = "main"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
> +
> [[package]]
> name = "pydocstyle"
> -version = "6.1.1"
> +version = "6.3.0"
> description = "Python docstring style checker"
> category = "dev"
> optional = false
> python-versions = ">=3.6"
>
> [package.dependencies]
> -snowballstemmer = "*"
> +snowballstemmer = ">=2.2.0"
>
> [package.extras]
> -toml = ["toml"]
> +toml = ["tomli (>=1.2.3)"]
>
> [[package]]
> name = "pyflakes"
> @@ -228,9 +299,24 @@ tests = ["pytest (>=7.1.2)", "pytest-mypy",
> "eradicate (>=2.0.0)", "radon (>=5.1
> toml = ["toml (>=0.10.2)"]
> vulture = ["vulture"]
>
> +[[package]]
> +name = "pynacl"
> +version = "1.5.0"
> +description = "Python binding to the Networking and Cryptography (NaCl)
> library"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.dependencies]
> +cffi = ">=1.4.1"
> +
> +[package.extras]
> +docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
> +tests = ["pytest (>=3.2.1,!=3.3.0)", "hypothesis (>=3.27.0)"]
> +
> [[package]]
> name = "pyrsistent"
> -version = "0.19.1"
> +version = "0.19.3"
> description = "Persistent/Functional/Immutable data structures"
> category = "main"
> optional = false
> @@ -270,7 +356,7 @@ python-versions = ">=3.7"
>
> [[package]]
> name = "types-pyyaml"
> -version = "6.0.12.1"
> +version = "6.0.12.10"
> description = "Typing stubs for PyYAML"
> category = "main"
> optional = false
> @@ -278,7 +364,7 @@ python-versions = "*"
>
> [[package]]
> name = "typing-extensions"
> -version = "4.4.0"
> +version = "4.6.2"
> description = "Backported and Experimental Type Hints for Python 3.7+"
> category = "dev"
> optional = false
> @@ -299,13 +385,16 @@ jsonschema = ">=4,<5"
> [metadata]
> lock-version = "1.1"
> python-versions = "^3.10"
> -content-hash =
> "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
> +content-hash =
> "c119901b1c13b14adf9b53624430aff8720d9a3a180b028579b82c65d49474df"
>
> [metadata.files]
> attrs = []
> +bcrypt = []
> black = []
> +cffi = []
> click = []
> colorama = []
> +cryptography = []
> isort = []
> jsonpatch = []
> jsonpointer = []
> @@ -313,20 +402,17 @@ jsonschema = []
> mccabe = []
> mypy = []
> mypy-extensions = []
> +paramiko = []
> pathspec = []
> -pexpect = [
> - {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash =
> "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
> - {file = "pexpect-4.8.0.tar.gz", hash =
> "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
> -]
> -platformdirs = [
> - {file = "platformdirs-2.5.2-py3-none-any.whl", hash =
> "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
> - {file = "platformdirs-2.5.2.tar.gz", hash =
> "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
> -]
> +pexpect = []
> +platformdirs = []
> ptyprocess = []
> pycodestyle = []
> +pycparser = []
> pydocstyle = []
> pyflakes = []
> pylama = []
> +pynacl = []
> pyrsistent = []
> pyyaml = []
> snowballstemmer = []
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index a136c91e..928837cb 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -13,6 +13,7 @@ pexpect = "^4.8.0"
> warlock = "^2.0.1"
> PyYAML = "^6.0"
> types-PyYAML = "^6.0.8"
> +paramiko = "^3.1.0"
>
> [tool.poetry.dev-dependencies]
> mypy = "^0.961"
> --
> 2.41.0
>
>
--
Patrick Robb
Technical Service Manager
UNH InterOperability Laboratory
21 Madbury Rd, Suite 100, Durham, NH 03824
www.iol.unh.edu
^ permalink raw reply [flat|nested] 8+ messages in thread
* Re: [PATCH v2 1/2] dts: add smoke tests
2023-07-10 16:19 ` [PATCH v2 1/2] dts: add " jspewock
@ 2023-07-11 9:42 ` Juraj Linkeš
2023-07-11 18:30 ` Jeremy Spewock
0 siblings, 1 reply; 8+ messages in thread
From: Juraj Linkeš @ 2023-07-11 9:42 UTC (permalink / raw)
To: jspewock
Cc: Honnappa.Nagarahalli, thomas, lijuan.tu, wathsala.vithanage, probb, dev
Just a few more comments.
On Mon, Jul 10, 2023 at 6:23 PM <jspewock@iol.unh.edu> wrote:
>
> From: Jeremy Spewock <jspewock@iol.unh.edu>
>
> Adds a new test suite for running smoke tests that verify general
> configuration aspects of the system under test. If any of these tests
> fail, the DTS execution terminates as part of a "fail-fast" model.
>
> Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
> ---
> dts/conf.yaml | 17 +-
> dts/framework/config/__init__.py | 116 +++++++++--
> dts/framework/config/conf_yaml_schema.json | 142 +++++++++++++-
> dts/framework/dts.py | 88 ++++++---
> dts/framework/exception.py | 12 ++
> dts/framework/remote_session/__init__.py | 10 +-
> dts/framework/remote_session/os_session.py | 24 ++-
> dts/framework/remote_session/posix_session.py | 29 ++-
> .../remote_session/remote/__init__.py | 10 +
> .../remote/interactive_remote_session.py | 118 ++++++++++++
> .../remote/interactive_shell.py | 99 ++++++++++
> .../remote_session/remote/testpmd_shell.py | 67 +++++++
> dts/framework/test_result.py | 37 +++-
> dts/framework/test_suite.py | 21 +-
> dts/framework/testbed_model/node.py | 2 +
> dts/framework/testbed_model/sut_node.py | 180 +++++++++++++-----
> dts/tests/TestSuite_smoke_tests.py | 118 ++++++++++++
> 17 files changed, 994 insertions(+), 96 deletions(-)
> create mode 100644 dts/framework/remote_session/remote/interactive_remote_session.py
> create mode 100644 dts/framework/remote_session/remote/interactive_shell.py
> create mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
> create mode 100644 dts/tests/TestSuite_smoke_tests.py
>
> diff --git a/dts/conf.yaml b/dts/conf.yaml
> index a9bd8a3e..2717de13 100644
> --- a/dts/conf.yaml
> +++ b/dts/conf.yaml
> @@ -10,9 +10,13 @@ executions:
> compiler_wrapper: ccache
> perf: false
> func: true
> + skip_smoke_tests: false #optional flag that allow you to ski smoke tests
Typo: ski
Also put a space after #
> test_suites:
> - hello_world
> - system_under_test: "SUT 1"
> + system_under_test:
> + node_name: "SUT 1"
> + vdevs: #optional: if removed vdevs won't be used in the execution
Missing space after #
The sentence after hugepages has a comma in it; let's unify those.
> + - "crypto_openssl"
> nodes:
> - name: "SUT 1"
> hostname: sut1.change.me.localhost
> @@ -20,6 +24,17 @@ nodes:
> arch: x86_64
> os: linux
> lcores: ""
> + ports:
I'm comparing my version with this patch and I've just noticed this -
let's put the ports at the end (after hugepages). This way we'll have
the configuration sorted into sections of sorts:
Cores/cpu config
Memory config
Port/devices config
> + - pci: "0000:00:08.0"
> + os_driver_for_dpdk: vfio-pci #OS driver that DPDK will use
Missing space after #
> + os_driver: i40e
> + peer_node: "TG 1"
> + peer_pci: "0000:00:08.0"
> + - pci: "0000:00:08.1"
> + os_driver_for_dpdk: vfio-pci
> + os_driver: i40e
> + peer_node: "TG 1"
> + peer_pci: "0000:00:08.1"
> use_first_core: false
> memory_channels: 4
> hugepages: # optional; if removed, will use system hugepage configuration
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index ebb0823f..75ac1cbe 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -12,6 +12,7 @@
> import pathlib
> from dataclasses import dataclass
> from enum import Enum, auto, unique
> +from pathlib import PurePath
> from typing import Any, TypedDict
>
> import warlock # type: ignore
> @@ -72,6 +73,20 @@ class HugepageConfiguration:
> force_first_numa: bool
>
>
> +@dataclass(slots=True, frozen=True)
> +class PortConfig:
> + node: str
> + pci: str
> + os_driver_for_dpdk: str
> + os_driver: str
> + peer_node: str
> + peer_pci: str
> +
> + @staticmethod
> + def from_dict(node: str, d: dict) -> "PortConfig":
> + return PortConfig(node=node, **d)
> +
> +
> @dataclass(slots=True, frozen=True)
> class NodeConfiguration:
> name: str
> @@ -84,6 +99,7 @@ class NodeConfiguration:
> use_first_core: bool
> memory_channels: int
> hugepages: HugepageConfiguration | None
> + ports: list[PortConfig]
>
> @staticmethod
> def from_dict(d: dict) -> "NodeConfiguration":
> @@ -92,18 +108,43 @@ def from_dict(d: dict) -> "NodeConfiguration":
> if "force_first_numa" not in hugepage_config:
> hugepage_config["force_first_numa"] = False
> hugepage_config = HugepageConfiguration(**hugepage_config)
> + common_config = {
> + "name": d["name"],
> + "hostname": d["hostname"],
> + "user": d["user"],
> + "password": d.get("password"),
> + "arch": Architecture(d["arch"]),
> + "os": OS(d["os"]),
> + "lcores": d.get("lcores", "1"),
> + "use_first_core": d.get("use_first_core", False),
> + "memory_channels": d.get("memory_channels", 1),
> + "hugepages": hugepage_config,
> + "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> + }
> +
> + return NodeConfiguration(**common_config)
> +
> +
> +@dataclass(slots=True)
Looks like this could be frozen as well. This should work even in the
future, as I imagine we'll get all the info we need just once.
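For example, a frozen variant would look like this (just an untested sketch; the sample values are made up):

    from dataclasses import dataclass

    @dataclass(slots=True, frozen=True)
    class NodeInfo:
        os_name: str
        os_version: str
        kernel_version: str

    info = NodeInfo(os_name="Ubuntu", os_version="22.04", kernel_version="5.15.0")
    # info.os_name = "Debian"  # would now raise dataclasses.FrozenInstanceError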
> +class NodeInfo:
> + """Class to hold important versions within the node.
> +
> + This class, unlike the NodeConfiguration class, cannot be generated at the start.
> + This is because we need to initialize a connection with the node before we can
> + collect the information needed in this class. Therefore, it cannot be a part of
> + the configuration class above.
> + """
>
> - return NodeConfiguration(
> - name=d["name"],
> - hostname=d["hostname"],
> - user=d["user"],
> - password=d.get("password"),
> - arch=Architecture(d["arch"]),
> - os=OS(d["os"]),
> - lcores=d.get("lcores", "1"),
> - use_first_core=d.get("use_first_core", False),
> - memory_channels=d.get("memory_channels", 1),
> - hugepages=hugepage_config,
> + os_name: str
> + os_version: str
> + kernel_version: str
> +
> + @staticmethod
> + def from_dict(d: dict):
> + return NodeInfo(
> + os_name=d["os_name"],
> + os_version=d["os_version"],
> + kernel_version=d["kernel_version"],
> )
We don't need the from_dict method, as we can instantiate this class
right away (node_info = NodeInfo(os_name=d["os_name"],
os_version=d["os_version"], kernel_version=d["kernel_version"])).
The other classes need this method because we're doing some processing
before instantiating the classes - this one doesn't need it.
>
>
> @@ -128,6 +169,24 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
> )
>
>
> +@dataclass(slots=True)
Can also be frozen.
> +class BuildTargetInfo:
> + """Class to hold important versions within the build target.
> +
> + This is very similar to the NodeVersionInfo class, it just instead holds information
References renamed class.
> + for the build target.
> + """
> +
> + dpdk_version: str
> + compiler_version: str
> +
> + @staticmethod
> + def from_dict(d: dict):
> + return BuildTargetInfo(
> + dpdk_version=d["dpdk_version"], compiler_version=d["compiler_version"]
> + )
Same as above.
> +
> +
> class TestSuiteConfigDict(TypedDict):
> suite: str
> cases: list[str]
> @@ -157,6 +216,8 @@ class ExecutionConfiguration:
> func: bool
> test_suites: list[TestSuiteConfig]
> system_under_test: NodeConfiguration
> + vdevs: list[str]
> + skip_smoke_tests: bool
>
> @staticmethod
> def from_dict(d: dict, node_map: dict) -> "ExecutionConfiguration":
> @@ -166,15 +227,20 @@ def from_dict(d: dict, node_map: dict) -> "ExecutionConfiguration":
> test_suites: list[TestSuiteConfig] = list(
> map(TestSuiteConfig.from_dict, d["test_suites"])
> )
> - sut_name = d["system_under_test"]
> + sut_name = d["system_under_test"]["node_name"]
> + skip_smoke_tests = d.get("skip_smoke_tests", False)
> assert sut_name in node_map, f"Unknown SUT {sut_name} in execution {d}"
> -
> + vdevs = (
> + d["system_under_test"]["vdevs"] if "vdevs" in d["system_under_test"] else []
> + )
> return ExecutionConfiguration(
> build_targets=build_targets,
> perf=d["perf"],
> func=d["func"],
> + skip_smoke_tests=skip_smoke_tests,
> test_suites=test_suites,
> system_under_test=node_map[sut_name],
> + vdevs=vdevs,
> )
>
>
> @@ -221,3 +287,27 @@ def load_config() -> Configuration:
>
>
> CONFIGURATION = load_config()
> +
> +
> +@unique
> +class InteractiveApp(Enum):
> + """An enum that represents different supported interactive applications
> +
> + The values in this enum must all be set to objects that have a key called
> + "default_path" where "default_path" represents a PurPath object for the path
> + to the application. This default path will be passed into the handler class
> + for the application so that it can start the application. For every key other
> + than the default shell option, the path will be appended to the path to the DPDK
> + build directory for the current SUT node.
> + """
> +
> + shell = {"default_path": PurePath()}
> + testpmd = {"default_path": PurePath("app", "dpdk-testpmd")}
> +
> + def get_path(self) -> PurePath:
> + """A method for getting the default paths of an application
> +
> + Returns:
> + String array that represents an OS agnostic path to the application.
> + """
> + return self.value["default_path"]
> diff --git a/dts/framework/config/conf_yaml_schema.json b/dts/framework/config/conf_yaml_schema.json
> index ca2d4a1e..61f52b43 100644
> --- a/dts/framework/config/conf_yaml_schema.json
> +++ b/dts/framework/config/conf_yaml_schema.json
> @@ -6,6 +6,76 @@
> "type": "string",
> "description": "A unique identifier for a node"
> },
> + "NIC": {
> + "type": "string",
> + "enum": [
> + "ALL",
> + "ConnectX3_MT4103",
> + "ConnectX4_LX_MT4117",
> + "ConnectX4_MT4115",
> + "ConnectX5_MT4119",
> + "ConnectX5_MT4121",
> + "I40E_10G-10G_BASE_T_BC",
> + "I40E_10G-10G_BASE_T_X722",
> + "I40E_10G-SFP_X722",
> + "I40E_10G-SFP_XL710",
> + "I40E_10G-X722_A0",
> + "I40E_1G-1G_BASE_T_X722",
> + "I40E_25G-25G_SFP28",
> + "I40E_40G-QSFP_A",
> + "I40E_40G-QSFP_B",
> + "IAVF-ADAPTIVE_VF",
> + "IAVF-VF",
> + "IAVF_10G-X722_VF",
> + "ICE_100G-E810C_QSFP",
> + "ICE_25G-E810C_SFP",
> + "ICE_25G-E810_XXV_SFP",
> + "IGB-I350_VF",
> + "IGB_1G-82540EM",
> + "IGB_1G-82545EM_COPPER",
> + "IGB_1G-82571EB_COPPER",
> + "IGB_1G-82574L",
> + "IGB_1G-82576",
> + "IGB_1G-82576_QUAD_COPPER",
> + "IGB_1G-82576_QUAD_COPPER_ET2",
> + "IGB_1G-82580_COPPER",
> + "IGB_1G-I210_COPPER",
> + "IGB_1G-I350_COPPER",
> + "IGB_1G-I354_SGMII",
> + "IGB_1G-PCH_LPTLP_I218_LM",
> + "IGB_1G-PCH_LPTLP_I218_V",
> + "IGB_1G-PCH_LPT_I217_LM",
> + "IGB_1G-PCH_LPT_I217_V",
> + "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
> + "IGC-I225_LM",
> + "IGC-I226_LM",
> + "IXGBE_10G-82599_SFP",
> + "IXGBE_10G-82599_SFP_SF_QP",
> + "IXGBE_10G-82599_T3_LOM",
> + "IXGBE_10G-82599_VF",
> + "IXGBE_10G-X540T",
> + "IXGBE_10G-X540_VF",
> + "IXGBE_10G-X550EM_A_SFP",
> + "IXGBE_10G-X550EM_X_10G_T",
> + "IXGBE_10G-X550EM_X_SFP",
> + "IXGBE_10G-X550EM_X_VF",
> + "IXGBE_10G-X550T",
> + "IXGBE_10G-X550_VF",
> + "brcm_57414",
> + "brcm_P2100G",
> + "cavium_0011",
> + "cavium_a034",
> + "cavium_a063",
> + "cavium_a064",
> + "fastlinq_ql41000",
> + "fastlinq_ql41000_vf",
> + "fastlinq_ql45000",
> + "fastlinq_ql45000_vf",
> + "hi1822",
> + "virtio"
> + ]
> + },
> +
> "ARCH": {
> "type": "string",
> "enum": [
> @@ -94,6 +164,19 @@
> "amount"
> ]
> },
> + "pci_address": {
> + "type": "string",
> + "pattern": "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
> + },
> + "port_peer_address": {
> + "description": "Peer is a TRex port, and IXIA port or a PCI address",
> + "oneOf": [
> + {
> + "description": "PCI peer port",
> + "$ref": "#/definitions/pci_address"
> + }
> + ]
> + },
> "test_suite": {
> "type": "string",
> "enum": [
> @@ -165,6 +248,44 @@
> },
> "hugepages": {
> "$ref": "#/definitions/hugepages"
> + },
> + "ports": {
> + "type": "array",
> + "items": {
> + "type": "object",
> + "description": "Each port should be described on both sides of the connection. This makes configuration slightly more verbose but greatly simplifies implementation. If there are an inconsistencies, then DTS will not run until that issue is fixed. An example inconsistency would be port 1, node 1 says it is connected to port 1, node 2, but port 1, node 2 says it is connected to port 2, node 1.",
> + "properties": {
> + "pci": {
> + "$ref": "#/definitions/pci_address",
> + "description": "The local PCI address of the port"
> + },
> + "os_driver_for_dpdk": {
> + "type": "string",
> + "description": "The driver that the kernel should bind this device to for DPDK to use it. (ex: vfio-pci)"
> + },
> + "os_driver": {
> + "type": "string",
> + "description": "The driver normally used by this port (ex: i40e)"
> + },
> + "peer_node": {
> + "type": "string",
> + "description": "The name of the node the peer port is on"
> + },
> + "peer_pci": {
> + "$ref": "#/definitions/pci_address",
> + "description": "The PCI address of the peer port"
> + }
> + },
> + "additionalProperties": false,
> + "required": [
> + "pci",
> + "os_driver_for_dpdk",
> + "os_driver",
> + "peer_node",
> + "peer_pci"
> + ]
> + },
> + "minimum": 1
> }
> },
> "additionalProperties": false,
> @@ -211,8 +332,27 @@
> ]
> }
> },
> + "skip_smoke_tests": {
> + "description": "Optional field that allows you to skip smoke testing",
> + "type": "boolean"
> + },
> "system_under_test": {
> - "$ref": "#/definitions/node_name"
> + "type":"object",
> + "properties": {
> + "node_name": {
> + "$ref": "#/definitions/node_name"
> + },
> + "vdevs": {
> + "description": "Opentional list of names of vdevs to be used in execution",
> + "type": "array",
> + "items": {
> + "type": "string"
> + }
> + }
> + },
> + "required": [
> + "node_name"
> + ]
> }
> },
> "additionalProperties": false,
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index 05022845..1b67938f 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -5,7 +5,13 @@
>
> import sys
>
> -from .config import CONFIGURATION, BuildTargetConfiguration, ExecutionConfiguration
> +from .config import (
> + CONFIGURATION,
> + BuildTargetConfiguration,
> + ExecutionConfiguration,
> + TestSuiteConfig,
> +)
> +from .exception import BlockingTestSuiteError
> from .logger import DTSLOG, getLogger
> from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
> from .test_suite import get_test_suites
> @@ -82,7 +88,7 @@ def _run_execution(
> running all build targets in the given execution.
> """
> dts_logger.info(f"Running execution with SUT '{execution.system_under_test.name}'.")
> - execution_result = result.add_execution(sut_node.config)
> + execution_result = result.add_execution(sut_node.config, sut_node.node_info)
>
> try:
> sut_node.set_up_execution(execution)
> @@ -118,14 +124,15 @@ def _run_build_target(
>
> try:
> sut_node.set_up_build_target(build_target)
> - result.dpdk_version = sut_node.dpdk_version
> + # result.dpdk_version = sut_node.dpdk_version
> + build_target_result.add_build_target_versions(sut_node.get_build_target_info())
> build_target_result.update_setup(Result.PASS)
> except Exception as e:
> dts_logger.exception("Build target setup failed.")
> build_target_result.update_setup(Result.FAIL, e)
>
> else:
> - _run_suites(sut_node, execution, build_target_result)
> + _run_all_suites(sut_node, execution, build_target_result)
>
> finally:
> try:
> @@ -136,7 +143,7 @@ def _run_build_target(
> build_target_result.update_teardown(Result.FAIL, e)
>
>
> -def _run_suites(
> +def _run_all_suites(
> sut_node: SutNode,
> execution: ExecutionConfiguration,
> build_target_result: BuildTargetResult,
> @@ -146,27 +153,62 @@ def _run_suites(
> with possibly only a subset of test cases.
> If no subset is specified, run all test cases.
> """
> + end_build_target = False
> + if not execution.skip_smoke_tests:
> + execution.test_suites[:0] = [TestSuiteConfig.from_dict("smoke_tests")]
> for test_suite_config in execution.test_suites:
> try:
> - full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
> - test_suite_classes = get_test_suites(full_suite_path)
> - suites_str = ", ".join((x.__name__ for x in test_suite_classes))
> - dts_logger.debug(
> - f"Found test suites '{suites_str}' in '{full_suite_path}'."
> + _run_single_suite(
> + sut_node, execution, build_target_result, test_suite_config
> )
> - except Exception as e:
> - dts_logger.exception("An error occurred when searching for test suites.")
> - result.update_setup(Result.ERROR, e)
> -
> - else:
> - for test_suite_class in test_suite_classes:
> - test_suite = test_suite_class(
> - sut_node,
> - test_suite_config.test_cases,
> - execution.func,
> - build_target_result,
> - )
> - test_suite.run()
> + except BlockingTestSuiteError as e:
> + dts_logger.exception(
> + f"An error occurred within {test_suite_config.test_suite}. "
> + "Skipping build target..."
> + )
> + result.add_error(e)
> + end_build_target = True
> + # if a blocking test failed and we need to bail out of suite executions
> + if end_build_target:
> + break
> +
> +
> +def _run_single_suite(
> + sut_node: SutNode,
> + execution: ExecutionConfiguration,
> + build_target_result: BuildTargetResult,
> + test_suite_config: TestSuiteConfig,
> +) -> None:
> + """Runs a single test suite.
> +
> + Args:
> + sut_node: Node to run tests on.
> + execution: Execution the test case belongs to.
> + build_target_result: Build target configuration test case is run on
> + test_suite_config: Test suite configuration
> +
> + Raises:
> + BlockingTestSuiteError: If a test suite that was marked as blocking fails.
> + """
> + try:
> + full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
> + test_suite_classes = get_test_suites(full_suite_path)
> + suites_str = ", ".join((x.__name__ for x in test_suite_classes))
> + dts_logger.debug(f"Found test suites '{suites_str}' in '{full_suite_path}'.")
> + except Exception as e:
> + dts_logger.exception("An error occurred when searching for test suites.")
> + result.update_setup(Result.ERROR, e)
> +
> + else:
> + for test_suite_class in test_suite_classes:
> + test_suite = test_suite_class(
> + sut_node,
> + test_suite_config.test_cases,
> + execution.func,
> + build_target_result,
> + result,
> + )
> + test_suite.run()
>
>
> def _exit_dts() -> None:
> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> index ca353d98..dfb12df4 100644
> --- a/dts/framework/exception.py
> +++ b/dts/framework/exception.py
> @@ -25,6 +25,7 @@ class ErrorSeverity(IntEnum):
> SSH_ERR = 4
> DPDK_BUILD_ERR = 10
> TESTCASE_VERIFY_ERR = 20
> + BLOCKING_TESTSUITE_ERR = 25
>
>
> class DTSError(Exception):
> @@ -144,3 +145,14 @@ def __init__(self, value: str):
>
> def __str__(self) -> str:
> return repr(self.value)
> +
> +
> +class BlockingTestSuiteError(DTSError):
> + suite_name: str
> + severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
> +
> + def __init__(self, suite_name: str) -> None:
> + self.suite_name = suite_name
> +
> + def __str__(self) -> str:
> + return f"Blocking suite {self.suite_name} failed."
> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> index ee221503..4fe32d35 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -1,5 +1,6 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2023 University of New Hampshire
>
> """
> The package provides modules for managing remote connections to a remote host (node),
> @@ -17,7 +18,14 @@
>
> from .linux_session import LinuxSession
> from .os_session import OSSession
> -from .remote import CommandResult, RemoteSession, SSHSession
> +from .remote import (
> + CommandResult,
> + InteractiveRemoteSession,
> + InteractiveShell,
> + RemoteSession,
> + SSHSession,
> + TestPmdShell,
> +)
>
>
> def create_session(
> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/remote_session/os_session.py
> index 4c48ae25..4346ecc4 100644
> --- a/dts/framework/remote_session/os_session.py
> +++ b/dts/framework/remote_session/os_session.py
> @@ -6,13 +6,19 @@
> from collections.abc import Iterable
> from pathlib import PurePath
>
> -from framework.config import Architecture, NodeConfiguration
> +from framework.config import Architecture, NodeConfiguration, NodeInfo
> from framework.logger import DTSLOG
> from framework.settings import SETTINGS
> from framework.testbed_model import LogicalCore
> from framework.utils import EnvVarsDict, MesonArgs
>
> -from .remote import CommandResult, RemoteSession, create_remote_session
> +from .remote import (
> + CommandResult,
> + InteractiveRemoteSession,
> + RemoteSession,
> + create_interactive_session,
> + create_remote_session,
> +)
>
>
> class OSSession(ABC):
> @@ -26,6 +32,7 @@ class OSSession(ABC):
> name: str
> _logger: DTSLOG
> remote_session: RemoteSession
> + interactive_session: InteractiveRemoteSession
>
> def __init__(
> self,
> @@ -37,6 +44,7 @@ def __init__(
> self.name = name
> self._logger = logger
> self.remote_session = create_remote_session(node_config, name, logger)
> + self.interactive_session = create_interactive_session(node_config, name, logger)
>
> def close(self, force: bool = False) -> None:
> """
> @@ -173,3 +181,15 @@ def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> if needed and mount the hugepages if needed.
> If force_first_numa is True, configure hugepages just on the first socket.
> """
> +
> + @abstractmethod
> + def get_compiler_version(self, compiler_name: str) -> str:
> + """
> + Get installed version of compiler used for DPDK
> + """
> +
> + @abstractmethod
> + def get_node_info(self) -> NodeInfo:
> + """
> + Collect information about the node
> + """
> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/remote_session/posix_session.py
> index d38062e8..f8ec159f 100644
> --- a/dts/framework/remote_session/posix_session.py
> +++ b/dts/framework/remote_session/posix_session.py
> @@ -6,7 +6,7 @@
> from collections.abc import Iterable
> from pathlib import PurePath, PurePosixPath
>
> -from framework.config import Architecture
> +from framework.config import Architecture, NodeInfo
> from framework.exception import DPDKBuildError, RemoteCommandExecutionError
> from framework.settings import SETTINGS
> from framework.utils import EnvVarsDict, MesonArgs
> @@ -219,3 +219,30 @@ def _remove_dpdk_runtime_dirs(
>
> def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> return ""
> +
> + def get_compiler_version(self, compiler_name: str) -> str:
> + match compiler_name:
> + case "gcc":
> + return self.send_command(f"{compiler_name} --version", 60).stdout.split(
> + "\n"
> + )[0]
The timeouts are still there.
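For example (untested sketch, assuming the intent is simply to fall back to SETTINGS.timeout, which this file already imports, rather than a hard-coded 60s):

    def get_compiler_version(self, compiler_name: str) -> str:
        # sketch of a PosixSession method body only, not a full implementation
        match compiler_name:
            case "gcc" | "clang":
                return self.send_command(
                    f"{compiler_name} --version", SETTINGS.timeout
                ).stdout.split("\n")[0]
            case "msvc":
                return self.send_command("cl", SETTINGS.timeout).stdout
            case "icc":
                return self.send_command(f"{compiler_name} -V", SETTINGS.timeout).stdout
            case _:
                raise ValueError(f"Unknown compiler {compiler_name}")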
> + case "clang":
> + return self.send_command(f"{compiler_name} --version", 60).stdout.split(
> + "\n"
> + )[0]
> + case "msvc":
> + return self.send_command("cl", 60).stdout
> + case "icc":
> + return self.send_command(f"{compiler_name} -V", 60).stdout
> + case _:
> + raise ValueError(f"Unknown compiler {compiler_name}")
> +
> + def get_node_info(self) -> NodeInfo:
> + os_release_info = self.send_command(
> + "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
> + SETTINGS.timeout,
> + ).stdout.split("\n")
> + kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
> + return NodeInfo(
> + os_release_info[0].strip(), os_release_info[1].strip(), kernel_version
> + )
> diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
> index 8a151221..224598a8 100644
> --- a/dts/framework/remote_session/remote/__init__.py
> +++ b/dts/framework/remote_session/remote/__init__.py
> @@ -1,16 +1,26 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2022-2023 University of New Hampshire
There are other instances of the copyright statement. It's not
necessary to change them all to just 2023, but I'd say it's
preferable.
>
> # pylama:ignore=W0611
>
> from framework.config import NodeConfiguration
> from framework.logger import DTSLOG
>
> +from .interactive_remote_session import InteractiveRemoteSession
> +from .interactive_shell import InteractiveShell
> from .remote_session import CommandResult, RemoteSession
> from .ssh_session import SSHSession
> +from .testpmd_shell import TestPmdShell
>
>
> def create_remote_session(
> node_config: NodeConfiguration, name: str, logger: DTSLOG
> ) -> RemoteSession:
> return SSHSession(node_config, name, logger)
> +
> +
> +def create_interactive_session(
> + node_config: NodeConfiguration, name: str, logger: DTSLOG
> +) -> InteractiveRemoteSession:
> + return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/remote/interactive_remote_session.py
> new file mode 100644
> index 00000000..e145d35d
> --- /dev/null
> +++ b/dts/framework/remote_session/remote/interactive_remote_session.py
> @@ -0,0 +1,118 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 University of New Hampshire
> +
> +import socket
> +import traceback
> +from pathlib import PurePath
> +from typing import Union
> +
> +from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore
> +from paramiko.ssh_exception import ( # type: ignore
> + AuthenticationException,
> + BadHostKeyException,
> + NoValidConnectionsError,
> + SSHException,
> +)
> +
> +from framework.config import InteractiveApp, NodeConfiguration
> +from framework.exception import SSHConnectionError
> +from framework.logger import DTSLOG
> +
> +from .interactive_shell import InteractiveShell
> +from .testpmd_shell import TestPmdShell
> +
> +
> +class InteractiveRemoteSession:
> + hostname: str
> + ip: str
> + port: int
> + username: str
> + password: str
> + _logger: DTSLOG
> + _node_config: NodeConfiguration
> + session: SSHClient
> + _transport: Transport | None
> +
> + def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
> + self._node_config = node_config
> + self._logger = _logger
> + self.hostname = node_config.hostname
> + self.username = node_config.user
> + self.password = node_config.password if node_config.password else ""
> + port = "22"
> + self.ip = node_config.hostname
> + if ":" in node_config.hostname:
> + self.ip, port = node_config.hostname.split(":")
> + self.port = int(port)
> + self._logger.info(
> + f"Initializing interactive connection for {self.username}@{self.hostname}"
> + )
> + self._connect()
> + self._logger.info(
> + f"Interactive connection successful for {self.username}@{self.hostname}"
> + )
> +
> + def _connect(self) -> None:
> + client = SSHClient()
> + client.set_missing_host_key_policy(AutoAddPolicy)
> + self.session = client
> + retry_attempts = 10
> + for retry_attempt in range(retry_attempts):
> + try:
> + client.connect(
> + self.ip,
> + username=self.username,
> + port=self.port,
> + password=self.password,
> + timeout=20 if self.port else 10,
> + )
> + except (TypeError, BadHostKeyException, AuthenticationException) as e:
> + self._logger.exception(e)
> + raise SSHConnectionError(self.hostname) from e
> + except (NoValidConnectionsError, socket.error, SSHException) as e:
> + self._logger.debug(traceback.format_exc())
> + self._logger.warning(e)
> + self._logger.info(
> + "Retrying interactive session connection: "
> + f"retry number {retry_attempt +1}"
> + )
> + else:
> + break
> + else:
> + raise SSHConnectionError(self.hostname)
> + # Interactive sessions are used on an "as needed" basis so we have
> + # to set a keepalive
> + self._transport = self.session.get_transport()
> + if self._transport is not None:
> + self._transport.set_keepalive(30)
> +
> + def create_interactive_shell(
> + self,
> + shell_type: InteractiveApp,
> + path_to_app: PurePath,
> + eal_parameters: str,
> + timeout: float,
> + ) -> Union[InteractiveShell, TestPmdShell]:
> + """
> + See "create_interactive_shell" in SutNode
> + """
> + match (shell_type):
> + case InteractiveApp.shell:
> + return InteractiveShell(
> + self.session, self._logger, path_to_app, timeout
> + )
> + case InteractiveApp.testpmd:
> + return TestPmdShell(
> + self.session,
> + self._logger,
> + path_to_app,
> + timeout=timeout,
> + eal_flags=eal_parameters,
> + )
> + case _:
> + self._logger.info(
> + f"Unhandled app type {shell_type.name}, defaulting to shell."
> + )
> + return InteractiveShell(
> + self.session, self._logger, path_to_app, timeout
> + )
> diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/remote/interactive_shell.py
> new file mode 100644
> index 00000000..4b0735c8
> --- /dev/null
> +++ b/dts/framework/remote_session/remote/interactive_shell.py
> @@ -0,0 +1,99 @@
> +from pathlib import PurePath
> +
> +from paramiko import Channel, SSHClient, channel # type: ignore
> +
> +from framework.logger import DTSLOG
> +from framework.settings import SETTINGS
> +
> +
> +class InteractiveShell:
> +
> + _interactive_session: SSHClient
> + _stdin: channel.ChannelStdinFile
> + _stdout: channel.ChannelFile
> + _ssh_channel: Channel
> + _logger: DTSLOG
> + _timeout: float
> + _path_to_app: PurePath
> +
> + def __init__(
> + self,
> + interactive_session: SSHClient,
> + logger: DTSLOG,
> + path_to_app: PurePath,
> + timeout: float = SETTINGS.timeout,
> + ) -> None:
> + self._interactive_session = interactive_session
> + self._ssh_channel = self._interactive_session.invoke_shell()
> + self._stdin = self._ssh_channel.makefile_stdin("w")
> + self._stdout = self._ssh_channel.makefile("r")
> + self._ssh_channel.settimeout(timeout)
> + self._ssh_channel.set_combine_stderr(True) # combines stdout and stderr streams
> + self._logger = logger
> + self._timeout = timeout
> + self._path_to_app = path_to_app
> +
> +
> + def send_command_no_output(self, command: str) -> None:
> + """Send command to channel without recording output.
> +
> + This method will not verify any input or output, it will simply assume the
> + command succeeded. This method will also consume all output in the buffer
> + after executing the command.
> + """
> + self._logger.info(
> + f"Sending command {command.strip()} and not collecting output"
> + )
> + self._stdin.write(f"{command}\n")
> + self._stdin.flush()
> + self.empty_stdout_buffer()
> +
> + def empty_stdout_buffer(self) -> None:
> + """Removes all data from the stdout buffer.
> +
> + Because of the way paramiko handles read buffers, there is no way to effectively
> + remove data from, or "flush", read buffers. This method essentially moves our
> + offset on the buffer to the end and thus "removes" the data from the buffer.
> + Timeouts are thrown on read operations of paramiko pipes based on whether data
> + had been received before timeout so we assume that if we reach the timeout then
> + we are at the end of the buffer.
> + """
> + self._ssh_channel.settimeout(0.5)
> + try:
> + for line in self._stdout:
> + pass
> + except TimeoutError:
> + pass
> + self._ssh_channel.settimeout(self._timeout) # reset timeout
> +
> + def send_command_get_output(self, command: str, prompt: str) -> str:
> + """Send a command and get all output before the expected ending string.
> +
> + Lines that expect input are not included in the stdout buffer so they cannot be
> + used for expect. For example, if you were prompted to log into something
> + with a username and password, you cannot expect "username:" because it won't
> + yet be in the stdout buffer. A work around for this could be consuming an
> + extra newline character to force the current prompt into the stdout buffer.
> +
> + Returns:
> + All output in the buffer before expected string
> + """
> + self._logger.info(f"Sending command {command.strip()}...")
> + self._stdin.write(f"{command}\n")
> + self._stdin.flush()
> + out: str = ""
> + for line in self._stdout:
> + out += line
> + if prompt in line and not line.rstrip().endswith(
> + command.rstrip()
> + ): # ignore line that sent command
> + break
> + self._logger.debug(f"Got output: {out}")
> + return out
> +
> + def close(self) -> None:
> + self._stdin.close()
> + self._ssh_channel.close()
> +
> + def __del__(self) -> None:
> + self.close()
> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/remote/testpmd_shell.py
> new file mode 100644
> index 00000000..bde3468b
> --- /dev/null
> +++ b/dts/framework/remote_session/remote/testpmd_shell.py
> @@ -0,0 +1,67 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 University of New Hampshire
> +
> +
> +from pathlib import PurePath
> +
> +from paramiko import SSHClient # type: ignore
> +
> +from framework.logger import DTSLOG
> +from framework.settings import SETTINGS
> +
> +from .interactive_shell import InteractiveShell
> +
> +
> +class TestPmdShell(InteractiveShell):
> + expected_prompt: str = "testpmd>"
> + _eal_flags: str
> +
> + def __init__(
> + self,
> + interactive_session: SSHClient,
> + logger: DTSLOG,
> + path_to_testpmd: PurePath,
> + eal_flags: str,
> + timeout: float = SETTINGS.timeout,
> + ) -> None:
> + """Initializes an interactive testpmd session using specified parameters."""
> + self._eal_flags = eal_flags
> +
> + super(TestPmdShell, self).__init__(
> + interactive_session,
> + logger=logger,
> + path_to_app=path_to_testpmd,
> + timeout=timeout,
> + )
> + self._start_application()
> +
> + def _start_application(self) -> None:
> + """Starts a new interactive testpmd shell using _path_to_app.
> + """
> + self.send_command(
> + f"{self._path_to_app} {self._eal_flags} -- -i",
> + )
> +
> + def send_command(self, command: str, prompt: str = expected_prompt) -> str:
> + """Specific way of handling the command for testpmd
> +
> + An extra newline character is consumed in order to force the current line into
> + the stdout buffer.
> + """
> + return self.send_command_get_output(f"{command}\n", prompt)
> +
> + def get_devices(self) -> list[str]:
> + """Get a list of device names that are known to testpmd
> +
> + Uses the device info listed in testpmd and then parses the output to
> + return only the names of the devices.
> +
> + Returns:
> + A list of strings representing device names (e.g. 0000:14:00.1)
> + """
> + dev_info: str = self.send_command("show device info all")
> + dev_list: list[str] = []
> + for line in dev_info.split("\n"):
> + if "device name:" in line.lower():
> + dev_list.append(line.strip().split(": ")[1].strip())
> + return dev_list
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index 74391982..2436eb7f 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -1,5 +1,6 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2022-2023 University of New Hampshire
>
> """
> Generic result container and reporters
> @@ -13,9 +14,11 @@
> OS,
> Architecture,
> BuildTargetConfiguration,
> + BuildTargetInfo,
> Compiler,
> CPUType,
> NodeConfiguration,
> + NodeInfo,
> )
> from .exception import DTSError, ErrorSeverity
> from .logger import DTSLOG
> @@ -67,12 +70,14 @@ class Statistics(dict):
> Using a dict provides a convenient way to format the data.
> """
>
> - def __init__(self, dpdk_version):
> + def __init__(self, output_info: dict[str, str] | None):
> super(Statistics, self).__init__()
> for result in Result:
> self[result.name] = 0
> self["PASS RATE"] = 0.0
> - self["DPDK VERSION"] = dpdk_version
> + if output_info:
> + for info_key, info_val in output_info.items():
> + self[info_key] = info_val
>
> def __iadd__(self, other: Result) -> "Statistics":
> """
> @@ -206,6 +211,8 @@ class BuildTargetResult(BaseResult):
> os: OS
> cpu: CPUType
> compiler: Compiler
> + compiler_version: str | None
> + dpdk_version: str | None
>
> def __init__(self, build_target: BuildTargetConfiguration):
> super(BuildTargetResult, self).__init__()
> @@ -213,6 +220,12 @@ def __init__(self, build_target: BuildTargetConfiguration):
> self.os = build_target.os
> self.cpu = build_target.cpu
> self.compiler = build_target.compiler
> + self.compiler_version = None
> + self.dpdk_version = None
> +
> + def add_build_target_versions(self, versions: BuildTargetInfo) -> None:
> + self.compiler_version = versions.compiler_version
> + self.dpdk_version = versions.dpdk_version
>
> def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
> test_suite_result = TestSuiteResult(test_suite_name)
> @@ -228,10 +241,17 @@ class ExecutionResult(BaseResult):
> """
>
> sut_node: NodeConfiguration
> + sut_os_name: str
> + sut_os_version: str
> + sut_kernel_version: str
>
> - def __init__(self, sut_node: NodeConfiguration):
> + def __init__(self, sut_node: NodeConfiguration, sut_version_info: NodeInfo):
> super(ExecutionResult, self).__init__()
> self.sut_node = sut_node
> + self.sut_version_info = sut_version_info
> + self.sut_os_name = sut_version_info.os_name
> + self.sut_os_version = sut_version_info.os_version
> + self.sut_kernel_version = sut_version_info.kernel_version
>
> def add_build_target(
> self, build_target: BuildTargetConfiguration
> @@ -258,6 +278,7 @@ class DTSResult(BaseResult):
> """
>
> dpdk_version: str | None
> + output: dict | None
> _logger: DTSLOG
> _errors: list[Exception]
> _return_code: ErrorSeverity
> @@ -267,14 +288,17 @@ class DTSResult(BaseResult):
> def __init__(self, logger: DTSLOG):
> super(DTSResult, self).__init__()
> self.dpdk_version = None
> + self.output = None
> self._logger = logger
> self._errors = []
> self._return_code = ErrorSeverity.NO_ERR
> self._stats_result = None
> self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
>
> - def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
> - execution_result = ExecutionResult(sut_node)
> + def add_execution(
> + self, sut_node: NodeConfiguration, sut_version_info: NodeInfo
> + ) -> ExecutionResult:
> + execution_result = ExecutionResult(sut_node, sut_version_info)
> self._inner_results.append(execution_result)
> return execution_result
>
> @@ -296,7 +320,8 @@ def process(self) -> None:
> for error in self._errors:
> self._logger.debug(repr(error))
>
> - self._stats_result = Statistics(self.dpdk_version)
> + self._stats_result = Statistics(self.output)
> + # add information gathered from the smoke tests to the statistics
> self.add_stats(self._stats_result)
> with open(self._stats_filename, "w+") as stats_file:
> stats_file.write(str(self._stats_result))
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 0705f38f..5df5d2a6 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -11,10 +11,21 @@
> import re
> from types import MethodType
>
> -from .exception import ConfigurationError, SSHTimeoutError, TestCaseVerifyError
> +from .exception import (
> + BlockingTestSuiteError,
> + ConfigurationError,
> + SSHTimeoutError,
> + TestCaseVerifyError,
> +)
> from .logger import DTSLOG, getLogger
> from .settings import SETTINGS
> -from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
> +from .test_result import (
> + BuildTargetResult,
> + DTSResult,
> + Result,
> + TestCaseResult,
> + TestSuiteResult,
> +)
> from .testbed_model import SutNode
>
>
> @@ -37,10 +48,12 @@ class TestSuite(object):
> """
>
> sut_node: SutNode
> + is_blocking = False
> _logger: DTSLOG
> _test_cases_to_run: list[str]
> _func: bool
> _result: TestSuiteResult
> + _dts_result: DTSResult
>
> def __init__(
> self,
> @@ -48,6 +61,7 @@ def __init__(
> test_cases: list[str],
> func: bool,
> build_target_result: BuildTargetResult,
> + dts_result: DTSResult,
> ):
> self.sut_node = sut_node
> self._logger = getLogger(self.__class__.__name__)
> @@ -55,6 +69,7 @@ def __init__(
> self._test_cases_to_run.extend(SETTINGS.test_cases)
> self._func = func
> self._result = build_target_result.add_test_suite(self.__class__.__name__)
> + self._dts_result = dts_result
>
> def set_up_suite(self) -> None:
> """
> @@ -118,6 +133,8 @@ def run(self) -> None:
> f"the next test suite may be affected."
> )
> self._result.update_setup(Result.ERROR, e)
> + if len(self._result.get_errors()) > 0 and self.is_blocking:
> + raise BlockingTestSuiteError(test_suite_name)
>
> def _execute_test_suite(self) -> None:
> """
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index d48fafe6..c5147e0e 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -40,6 +40,7 @@ class Node(object):
> lcores: list[LogicalCore]
> _logger: DTSLOG
> _other_sessions: list[OSSession]
> + _execution_config: ExecutionConfiguration
>
> def __init__(self, node_config: NodeConfiguration):
> self.config = node_config
> @@ -64,6 +65,7 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> """
> self._setup_hugepages()
> self._set_up_execution(execution_config)
> + self._execution_config = execution_config
>
> def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> """
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 2b2b50d9..9b17ac3d 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -1,14 +1,27 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2010-2014 Intel Corporation
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> +# Copyright(c) 2022-2023 University of New Hampshire
>
> import os
> import tarfile
> import time
> from pathlib import PurePath
> -
> -from framework.config import BuildTargetConfiguration, NodeConfiguration
> -from framework.remote_session import CommandResult, OSSession
> +from typing import Union
> +
> +from framework.config import (
> + BuildTargetConfiguration,
> + BuildTargetInfo,
> + InteractiveApp,
> + NodeConfiguration,
> + NodeInfo,
> +)
> +from framework.remote_session import (
> + CommandResult,
> + InteractiveShell,
> + OSSession,
> + TestPmdShell,
> +)
> from framework.settings import SETTINGS
> from framework.utils import EnvVarsDict, MesonArgs
>
> @@ -16,6 +29,52 @@
> from .node import Node
>
>
> +class EalParameters(object):
> + def __init__(
> + self,
> + lcore_list: LogicalCoreList,
> + memory_channels: int,
> + prefix: str,
> + no_pci: bool,
> + vdevs: list[VirtualDevice],
> + other_eal_param: str,
> + ):
> + """
> + Generate eal parameters character string;
> + :param lcore_list: the list of logical cores to use.
> + :param memory_channels: the number of memory channels to use.
> + :param prefix: set file prefix string, eg:
> + prefix='vf'
> + :param no_pci: switch of disable PCI bus eg:
> + no_pci=True
> + :param vdevs: virtual device list, eg:
> + vdevs=[
> + VirtualDevice('net_ring0'),
> + VirtualDevice('net_ring1')
> + ]
> + :param other_eal_param: user defined DPDK eal parameters, eg:
> + other_eal_param='--single-file-segments'
> + """
> + self._lcore_list = f"-l {lcore_list}"
> + self._memory_channels = f"-n {memory_channels}"
> + self._prefix = prefix
> + if prefix:
> + self._prefix = f"--file-prefix={prefix}"
> + self._no_pci = "--no-pci" if no_pci else ""
> + self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
> + self._other_eal_param = other_eal_param
> +
> + def __str__(self) -> str:
> + return (
> + f"{self._lcore_list} "
> + f"{self._memory_channels} "
> + f"{self._prefix} "
> + f"{self._no_pci} "
> + f"{self._vdevs} "
> + f"{self._other_eal_param}"
> + )
> +
> +
> class SutNode(Node):
> """
> A class for managing connections to the System under Test, providing
> @@ -30,9 +89,11 @@ class SutNode(Node):
> _env_vars: EnvVarsDict
> _remote_tmp_dir: PurePath
> __remote_dpdk_dir: PurePath | None
> - _dpdk_version: str | None
> _app_compile_timeout: float
> _dpdk_kill_session: OSSession | None
> + _dpdk_version: str | None
> + _node_info: NodeInfo | None
> + _compiler_version: str | None
>
> def __init__(self, node_config: NodeConfiguration):
> super(SutNode, self).__init__(node_config)
> @@ -41,12 +102,14 @@ def __init__(self, node_config: NodeConfiguration):
> self._env_vars = EnvVarsDict()
> self._remote_tmp_dir = self.main_session.get_remote_tmp_dir()
> self.__remote_dpdk_dir = None
> - self._dpdk_version = None
> self._app_compile_timeout = 90
> self._dpdk_kill_session = None
> self._dpdk_timestamp = (
> f"{str(os.getpid())}_{time.strftime('%Y%m%d%H%M%S', time.localtime())}"
> )
> + self._dpdk_version = None
> + self._node_info = None
> + self._compiler_version = None
>
> @property
> def _remote_dpdk_dir(self) -> PurePath:
> @@ -75,6 +138,32 @@ def dpdk_version(self) -> str:
> )
> return self._dpdk_version
>
> + @property
> + def node_info(self) -> NodeInfo:
> + if self._node_info is None:
> + self._node_info = self.main_session.get_node_info()
> + return self._node_info
> +
> + @property
> + def compiler_version(self) -> str:
> + if self._compiler_version is None:
> + if self._build_target_config is not None:
> + self._compiler_version = self.main_session.get_compiler_version(
> + self._build_target_config.compiler.name
> + )
> + else:
> + self._logger.warning(
> + "Failed to get compiler version because"
> + "_build_target_config is None."
> + )
> + return ""
> + return self._compiler_version
> +
> + def get_build_target_info(self) -> BuildTargetInfo:
> + return BuildTargetInfo(
> + dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
> + )
> +
> def _guess_dpdk_remote_dir(self) -> PurePath:
> return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
>
> @@ -84,6 +173,10 @@ def _set_up_build_target(
> """
> Setup DPDK on the SUT node.
> """
> + # we want to ensure that dpdk_version and compiler_version is reset for new
> + # build targets
> + self._dpdk_version = None
> + self._compiler_version = None
> self._configure_build_target(build_target_config)
> self._copy_dpdk_tarball()
> self._build_dpdk()
> @@ -262,48 +355,43 @@ def run_dpdk_app(
> f"{app_path} {eal_args}", timeout, verify=True
> )
>
> -
> -class EalParameters(object):
> - def __init__(
> + def create_interactive_shell(
> self,
> - lcore_list: LogicalCoreList,
> - memory_channels: int,
> - prefix: str,
> - no_pci: bool,
> - vdevs: list[VirtualDevice],
> - other_eal_param: str,
> - ):
> + shell_type: InteractiveApp,
> + path_to_app: PurePath | None = None,
> + timeout: float = SETTINGS.timeout,
> + eal_parameters: EalParameters | None = None,
> + ) -> Union[InteractiveShell, TestPmdShell]:
> + """Create a handler for an interactive session.
> +
> + This method is a factory that calls a method in OSSession to create shells for
> + different DPDK applications.
> +
> + Args:
> + shell_type: Enum value representing the desired application.
> + path_to_app: Represents a path to the application you are attempting to
> + launch. This path will be executed at the start of the app
> + initialization. If one isn't provided, the default specified in the
> + enumeration will be used.
> + timeout: Timeout for reading output from the SSH channel. If you are
> + reading from the buffer and don't receive any data within the timeout
> + it will throw an error.
> + eal_parameters: List of EAL parameters to use to launch the app. This is
> + ignored for base "shell" types.
> + Returns:
> + Instance of the desired interactive application.
> """
> - Generate eal parameters character string;
> - :param lcore_list: the list of logical cores to use.
> - :param memory_channels: the number of memory channels to use.
> - :param prefix: set file prefix string, eg:
> - prefix='vf'
> - :param no_pci: switch of disable PCI bus eg:
> - no_pci=True
> - :param vdevs: virtual device list, eg:
> - vdevs=[
> - VirtualDevice('net_ring0'),
> - VirtualDevice('net_ring1')
> - ]
> - :param other_eal_param: user defined DPDK eal parameters, eg:
> - other_eal_param='--single-file-segments'
> - """
> - self._lcore_list = f"-l {lcore_list}"
> - self._memory_channels = f"-n {memory_channels}"
> - self._prefix = prefix
> - if prefix:
> - self._prefix = f"--file-prefix={prefix}"
> - self._no_pci = "--no-pci" if no_pci else ""
> - self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
> - self._other_eal_param = other_eal_param
> + # if we just want a default shell, there is no need to append the DPDK build
> + # directory to the path
> + default_path = shell_type.get_path()
>
> - def __str__(self) -> str:
> - return (
> - f"{self._lcore_list} "
> - f"{self._memory_channels} "
> - f"{self._prefix} "
> - f"{self._no_pci} "
> - f"{self._vdevs} "
> - f"{self._other_eal_param}"
> + if shell_type != InteractiveApp.shell:
> + default_path = self.main_session.join_remote_path(
> + self.remote_dpdk_build_dir, shell_type.get_path()
> + )
> + return self.main_session.interactive_session.create_interactive_shell(
> + shell_type,
> + path_to_app if path_to_app else default_path,
> + str(eal_parameters) if eal_parameters else "",
> + timeout,
> )
I forgot to mention that I'd like to change the structure a bit here -
calling self.main_session.create_interactive_shell() would make more
sense to me, as the interactive_session is basically an implementation
detail.
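Something along these lines is what I have in mind - a minimal sketch
only, assuming OSSession keeps interactive_session as a private detail
(the placement and the pass-through signature are just a suggestion,
imports as already present in os_session.py):

    # In OSSession - thin pass-through so callers never reach into
    # interactive_session directly.
    def create_interactive_shell(
        self,
        shell_type: InteractiveApp,
        path_to_app: PurePath,
        eal_parameters: str,
        timeout: float,
    ) -> Union[InteractiveShell, TestPmdShell]:
        return self.interactive_session.create_interactive_shell(
            shell_type, path_to_app, eal_parameters, timeout
        )

SutNode.create_interactive_shell() would then return
self.main_session.create_interactive_shell(...) instead of reaching
through self.main_session.interactive_session.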
> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> new file mode 100644
> index 00000000..b7e70ee1
> --- /dev/null
> +++ b/dts/tests/TestSuite_smoke_tests.py
> @@ -0,0 +1,118 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 University of New Hampshire
> +
> +import re
> +
> +from framework.config import InteractiveApp
> +from framework.remote_session import TestPmdShell
> +from framework.settings import SETTINGS
> +from framework.test_suite import TestSuite
> +
> +
> +class SmokeTests(TestSuite):
> + is_blocking = True
> + # dicts in this list are expected to have two keys:
> + # "pci_address" and "current_driver"
> + nics_in_node: list[dict[str, str]] = []
> +
> + def set_up_suite(self) -> None:
> + """
> + Setup:
> + Set the build directory path and generate a list of NICs in the SUT node.
> + """
> + self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
> + for nic in self.sut_node.config.ports:
> + new_dict = {
> + "pci_address": nic.pci,
> + "current_driver": nic.os_driver.strip(),
> + }
> + self.nics_in_node.append(new_dict)
> +
> + def test_unit_tests(self) -> None:
> + """
> + Test:
> + Run the fast-test unit-test suite through meson.
> + """
> + self.sut_node.main_session.send_command(
> + f"meson test -C {self.dpdk_build_dir_path} --suite fast-tests",
> + 300,
> + verify=True,
> + )
> +
> + def test_driver_tests(self) -> None:
> + """
> + Test:
> + Run the driver-test unit-test suite through meson.
> + """
> + list_of_vdevs = ""
> + for dev in self.sut_node._execution_config.vdevs:
> + list_of_vdevs += f"--vdev {dev} "
> + list_of_vdevs = list_of_vdevs[:-1]
> + if list_of_vdevs:
> + self._logger.info(
> + "Running driver tests with the following virtual "
> + f"devices: {list_of_vdevs}"
> + )
> + self.sut_node.main_session.send_command(
> + f"meson test -C {self.dpdk_build_dir_path} --suite driver-tests "
> + f'--test-args "{list_of_vdevs}"',
> + 300,
> + verify=True,
> + )
> + else:
> + self.sut_node.main_session.send_command(
> + f"meson test -C {self.dpdk_build_dir_path} --suite driver-tests",
> + 300,
> + verify=True,
> + )
> +
> + def test_devices_listed_in_testpmd(self) -> None:
> + """
> + Test:
> + Uses testpmd driver to verify that devices have been found by testpmd.
> + """
> + testpmd_driver = self.sut_node.create_interactive_shell(InteractiveApp.testpmd)
> + # We know it should always be a TestPmdShell but mypy doesn't
> + assert isinstance(testpmd_driver, TestPmdShell)
> + dev_list: list[str] = testpmd_driver.get_devices()
> + for nic in self.nics_in_node:
> + self.verify(
> + nic["pci_address"] in dev_list,
> + f"Device {nic['pci_address']} was not listed in testpmd's available devices, "
> + "please check your configuration",
> + )
> +
> + def test_device_bound_to_driver(self) -> None:
> + """
> + Test:
> + Ensure that all drivers listed in the config are bound to the correct driver.
> + """
> + path_to_devbind = self.sut_node.main_session.join_remote_path(
> + self.sut_node._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
> + )
> +
> + regex_for_pci_address = "/[0-9]{4}:[0-9]{2}:[0-9]{2}.[0-9]{1}/"
This shouldn't be tucked away in a test case. It should be in some
high-level module - let's put it in utils.py for the time being.
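For example, a module-level constant (the name is just a suggestion) -
sketch:

    # dts/framework/utils.py - shared so every suite and framework
    # module matches PCI addresses the same way.
    REGEX_FOR_PCI_ADDRESS = "/[0-9]{4}:[0-9]{2}:[0-9]{2}.[0-9]{1}/"

and the test case would then import and reuse it:

    from framework.utils import REGEX_FOR_PCI_ADDRESS
    ...
    f"{path_to_devbind} --status | awk '{REGEX_FOR_PCI_ADDRESS}'"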
> + all_nics_in_dpdk_devbind = self.sut_node.main_session.send_command(
> + f"{path_to_devbind} --status | awk '{regex_for_pci_address}'",
> + SETTINGS.timeout,
> + ).stdout
> +
> + for nic in self.nics_in_node:
> + # This regular expression finds the line in the above string that starts
> + # with the address for the nic we are on in the loop and then captures the
> + # name of the driver in a group
> + devbind_info_for_nic = re.search(
> + f"{nic['pci_address']}[^\\n]*drv=([\\d\\w]*) [^\\n]*",
> + all_nics_in_dpdk_devbind,
> + )
> + self.verify(
> + devbind_info_for_nic is not None,
> + f"Failed to find configured device ({nic['pci_address']}) using dpdk-devbind.py",
> + )
> + # We know this isn't None, but mypy doesn't
> + assert devbind_info_for_nic is not None
> + self.verify(
> + devbind_info_for_nic.group(1) == nic["current_driver"],
> + f"Driver for device {nic['pci_address']} does not match driver listed in "
> + f"configuration (bound to {devbind_info_for_nic.group(1)})",
> + )
> --
> 2.41.0
>
^ permalink raw reply [flat|nested] 8+ messages in thread
* Re: [PATCH v2 1/2] dts: add smoke tests
2023-07-11 9:42 ` Juraj Linkeš
@ 2023-07-11 18:30 ` Jeremy Spewock
0 siblings, 0 replies; 8+ messages in thread
From: Jeremy Spewock @ 2023-07-11 18:30 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, thomas, lijuan.tu, wathsala.vithanage, probb, dev
[-- Attachment #1: Type: text/plain, Size: 69497 bytes --]
On Tue, Jul 11, 2023 at 5:42 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Just a few more comments.
>
> On Mon, Jul 10, 2023 at 6:23 PM <jspewock@iol.unh.edu> wrote:
> >
> > From: Jeremy Spewock <jspewock@iol.unh.edu>
> >
> > Adds a new test suite for running smoke tests that verify general
> > configuration aspects of the system under test. If any of these tests
> > fail, the DTS execution terminates as part of a "fail-fast" model.
> >
> > Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
> > ---
> > dts/conf.yaml | 17 +-
> > dts/framework/config/__init__.py | 116 +++++++++--
> > dts/framework/config/conf_yaml_schema.json | 142 +++++++++++++-
> > dts/framework/dts.py | 88 ++++++---
> > dts/framework/exception.py | 12 ++
> > dts/framework/remote_session/__init__.py | 10 +-
> > dts/framework/remote_session/os_session.py | 24 ++-
> > dts/framework/remote_session/posix_session.py | 29 ++-
> > .../remote_session/remote/__init__.py | 10 +
> > .../remote/interactive_remote_session.py | 118 ++++++++++++
> > .../remote/interactive_shell.py | 99 ++++++++++
> > .../remote_session/remote/testpmd_shell.py | 67 +++++++
> > dts/framework/test_result.py | 37 +++-
> > dts/framework/test_suite.py | 21 +-
> > dts/framework/testbed_model/node.py | 2 +
> > dts/framework/testbed_model/sut_node.py | 180 +++++++++++++-----
> > dts/tests/TestSuite_smoke_tests.py | 118 ++++++++++++
> > 17 files changed, 994 insertions(+), 96 deletions(-)
> > create mode 100644
> dts/framework/remote_session/remote/interactive_remote_session.py
> > create mode 100644
> dts/framework/remote_session/remote/interactive_shell.py
> > create mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
> > create mode 100644 dts/tests/TestSuite_smoke_tests.py
> >
> > diff --git a/dts/conf.yaml b/dts/conf.yaml
> > index a9bd8a3e..2717de13 100644
> > --- a/dts/conf.yaml
> > +++ b/dts/conf.yaml
> > @@ -10,9 +10,13 @@ executions:
> > compiler_wrapper: ccache
> > perf: false
> > func: true
> > + skip_smoke_tests: false #optional flag that allow you to ski smoke
> tests
>
> Typo: ski
> Also put a space after #
>
> > test_suites:
> > - hello_world
> > - system_under_test: "SUT 1"
> > + system_under_test:
> > + node_name: "SUT 1"
> > + vdevs: #optional: if removed vdevs won't be used in the execution
>
> Missing space after #
> The sentence after hugepages has a comma in it, let's unify those.
>
> > + - "crypto_openssl"
> > nodes:
> > - name: "SUT 1"
> > hostname: sut1.change.me.localhost
> > @@ -20,6 +24,17 @@ nodes:
> > arch: x86_64
> > os: linux
> > lcores: ""
> > + ports:
>
> I'm comparing my version with this patch and I've just noticed this -
> let's put the ports at the end (after hugepages). This way we'll have
> the configuration sorted into sections of sorts:
> Cores/cpu config
> Memory config
> Port/devices config
>
> > + - pci: "0000:00:08.0"
> > + os_driver_for_dpdk: vfio-pci #OS driver that DPDK will use
>
> Missing space after #
>
>
Sorry, I'm just used to writing comments without the space, but I'll fix
this.
> > + os_driver: i40e
> > + peer_node: "TG 1"
> > + peer_pci: "0000:00:08.0"
> > + - pci: "0000:00:08.1"
> > + os_driver_for_dpdk: vfio-pci
> > + os_driver: i40e
> > + peer_node: "TG 1"
> > + peer_pci: "0000:00:08.1"
> > use_first_core: false
> > memory_channels: 4
> > hugepages: # optional; if removed, will use system hugepage
> configuration
> > diff --git a/dts/framework/config/__init__.py
> b/dts/framework/config/__init__.py
> > index ebb0823f..75ac1cbe 100644
> > --- a/dts/framework/config/__init__.py
> > +++ b/dts/framework/config/__init__.py
> > @@ -12,6 +12,7 @@
> > import pathlib
> > from dataclasses import dataclass
> > from enum import Enum, auto, unique
> > +from pathlib import PurePath
> > from typing import Any, TypedDict
> >
> > import warlock # type: ignore
> > @@ -72,6 +73,20 @@ class HugepageConfiguration:
> > force_first_numa: bool
> >
> >
> > +@dataclass(slots=True, frozen=True)
> > +class PortConfig:
> > + node: str
> > + pci: str
> > + os_driver_for_dpdk: str
> > + os_driver: str
> > + peer_node: str
> > + peer_pci: str
> > +
> > + @staticmethod
> > + def from_dict(node: str, d: dict) -> "PortConfig":
> > + return PortConfig(node=node, **d)
> > +
> > +
> > @dataclass(slots=True, frozen=True)
> > class NodeConfiguration:
> > name: str
> > @@ -84,6 +99,7 @@ class NodeConfiguration:
> > use_first_core: bool
> > memory_channels: int
> > hugepages: HugepageConfiguration | None
> > + ports: list[PortConfig]
> >
> > @staticmethod
> > def from_dict(d: dict) -> "NodeConfiguration":
> > @@ -92,18 +108,43 @@ def from_dict(d: dict) -> "NodeConfiguration":
> > if "force_first_numa" not in hugepage_config:
> > hugepage_config["force_first_numa"] = False
> > hugepage_config = HugepageConfiguration(**hugepage_config)
> > + common_config = {
> > + "name": d["name"],
> > + "hostname": d["hostname"],
> > + "user": d["user"],
> > + "password": d.get("password"),
> > + "arch": Architecture(d["arch"]),
> > + "os": OS(d["os"]),
> > + "lcores": d.get("lcores", "1"),
> > + "use_first_core": d.get("use_first_core", False),
> > + "memory_channels": d.get("memory_channels", 1),
> > + "hugepages": hugepage_config,
> > + "ports": [PortConfig.from_dict(d["name"], port) for port in
> d["ports"]],
> > + }
> > +
> > + return NodeConfiguration(**common_config)
> > +
> > +
> > +@dataclass(slots=True)
>
> Looks like this could be frozen as well. This should work even in the
> future, as I imagine we'll get all the info we need just once.
>
You're right, this data isn't really something that's going to change once
we collect it, and it's probably better to enforce that. I'll make this
change.
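i.e. just this, keeping the existing fields (sketch):

    @dataclass(slots=True, frozen=True)
    class NodeInfo:
        os_name: str
        os_version: str
        kernel_version: str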
> > +class NodeInfo:
> > + """Class to hold important versions within the node.
> > +
> > + This class, unlike the NodeConfiguration class, cannot be generated
> at the start.
> > + This is because we need to initialize a connection with the node
> before we can
> > + collect the information needed in this class. Therefore, it cannot
> be a part of
> > + the configuration class above.
> > + """
> >
> > - return NodeConfiguration(
> > - name=d["name"],
> > - hostname=d["hostname"],
> > - user=d["user"],
> > - password=d.get("password"),
> > - arch=Architecture(d["arch"]),
> > - os=OS(d["os"]),
> > - lcores=d.get("lcores", "1"),
> > - use_first_core=d.get("use_first_core", False),
> > - memory_channels=d.get("memory_channels", 1),
> > - hugepages=hugepage_config,
> > + os_name: str
> > + os_version: str
> > + kernel_version: str
> > +
> > + @staticmethod
> > + def from_dict(d: dict):
> > + return NodeInfo(
> > + os_name=d["os_name"],
> > + os_version=d["os_version"],
> > + kernel_version=d["kernel_version"],
> > )
>
> We don't need the from_dict method, as we can instantiate this class
> right away (node_info = NodeInfo(os_name=d["os_name"],
> os_version=d["os_version"], kernel_version=d["kernel_version"])).
>
> The other classes need this method because we're doing some processing
> before instantiating the classes - this one doesn't need it.
>
Good point, I included it to make it similar to the others, but it really
isn't necessary because I'm not even using it.
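For reference, the call site in posix_session.py would then construct it
directly and from_dict can simply be deleted - roughly (sketch, keyword
arguments added only for readability):

    def get_node_info(self) -> NodeInfo:
        os_release_info = self.send_command(
            "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
            SETTINGS.timeout,
        ).stdout.split("\n")
        kernel_version = self.send_command("uname -r", SETTINGS.timeout).stdout
        return NodeInfo(
            os_name=os_release_info[0].strip(),
            os_version=os_release_info[1].strip(),
            kernel_version=kernel_version,
        )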
>
> >
> >
> > @@ -128,6 +169,24 @@ def from_dict(d: dict) ->
> "BuildTargetConfiguration":
> > )
> >
> >
> > +@dataclass(slots=True)
>
> Can also be frozen.
>
> > +class BuildTargetInfo:
> > + """Class to hold important versions within the build target.
> > +
> > + This is very similar to the NodeVersionInfo class, it just instead
> holds information
>
> References renamed class.
>
Good catch, I totally missed the reference to the other class in this
comment.
>
> > + for the build target.
> > + """
> > +
> > + dpdk_version: str
> > + compiler_version: str
> > +
> > + @staticmethod
> > + def from_dict(d: dict):
> > + return BuildTargetInfo(
> > + dpdk_version=d["dpdk_version"],
> compiler_version=d["compiler_version"]
> > + )
>
> Same as above.
>
Good catches on these as well, I'll be sure to make the changes.
>
> > +
> > +
> > class TestSuiteConfigDict(TypedDict):
> > suite: str
> > cases: list[str]
> > @@ -157,6 +216,8 @@ class ExecutionConfiguration:
> > func: bool
> > test_suites: list[TestSuiteConfig]
> > system_under_test: NodeConfiguration
> > + vdevs: list[str]
> > + skip_smoke_tests: bool
> >
> > @staticmethod
> > def from_dict(d: dict, node_map: dict) -> "ExecutionConfiguration":
> > @@ -166,15 +227,20 @@ def from_dict(d: dict, node_map: dict) ->
> "ExecutionConfiguration":
> > test_suites: list[TestSuiteConfig] = list(
> > map(TestSuiteConfig.from_dict, d["test_suites"])
> > )
> > - sut_name = d["system_under_test"]
> > + sut_name = d["system_under_test"]["node_name"]
> > + skip_smoke_tests = d.get("skip_smoke_tests", False)
> > assert sut_name in node_map, f"Unknown SUT {sut_name} in
> execution {d}"
> > -
> > + vdevs = (
> > + d["system_under_test"]["vdevs"] if "vdevs" in
> d["system_under_test"] else []
> > + )
> > return ExecutionConfiguration(
> > build_targets=build_targets,
> > perf=d["perf"],
> > func=d["func"],
> > + skip_smoke_tests=skip_smoke_tests,
> > test_suites=test_suites,
> > system_under_test=node_map[sut_name],
> > + vdevs=vdevs,
> > )
> >
> >
> > @@ -221,3 +287,27 @@ def load_config() -> Configuration:
> >
> >
> > CONFIGURATION = load_config()
> > +
> > +
> > +@unique
> > +class InteractiveApp(Enum):
> > + """An enum that represents different supported interactive
> applications
> > +
> > + The values in this enum must all be set to objects that have a key
> called
> > + "default_path" where "default_path" represents a PurPath object for
> the path
> > + to the application. This default path will be passed into the
> handler class
> > + for the application so that it can start the application. For every
> key other
> > + than the default shell option, the path will be appended to the
> path to the DPDK
> > + build directory for the current SUT node.
> > + """
> > +
> > + shell = {"default_path": PurePath()}
> > + testpmd = {"default_path": PurePath("app", "dpdk-testpmd")}
> > +
> > + def get_path(self) -> PurePath:
> > + """A method for getting the default paths of an application
> > +
> > + Returns:
> > + String array that represents an OS agnostic path to the
> application.
> > + """
> > + return self.value["default_path"]
> > diff --git a/dts/framework/config/conf_yaml_schema.json
> b/dts/framework/config/conf_yaml_schema.json
> > index ca2d4a1e..61f52b43 100644
> > --- a/dts/framework/config/conf_yaml_schema.json
> > +++ b/dts/framework/config/conf_yaml_schema.json
> > @@ -6,6 +6,76 @@
> > "type": "string",
> > "description": "A unique identifier for a node"
> > },
> > + "NIC": {
> > + "type": "string",
> > + "enum": [
> > + "ALL",
> > + "ConnectX3_MT4103",
> > + "ConnectX4_LX_MT4117",
> > + "ConnectX4_MT4115",
> > + "ConnectX5_MT4119",
> > + "ConnectX5_MT4121",
> > + "I40E_10G-10G_BASE_T_BC",
> > + "I40E_10G-10G_BASE_T_X722",
> > + "I40E_10G-SFP_X722",
> > + "I40E_10G-SFP_XL710",
> > + "I40E_10G-X722_A0",
> > + "I40E_1G-1G_BASE_T_X722",
> > + "I40E_25G-25G_SFP28",
> > + "I40E_40G-QSFP_A",
> > + "I40E_40G-QSFP_B",
> > + "IAVF-ADAPTIVE_VF",
> > + "IAVF-VF",
> > + "IAVF_10G-X722_VF",
> > + "ICE_100G-E810C_QSFP",
> > + "ICE_25G-E810C_SFP",
> > + "ICE_25G-E810_XXV_SFP",
> > + "IGB-I350_VF",
> > + "IGB_1G-82540EM",
> > + "IGB_1G-82545EM_COPPER",
> > + "IGB_1G-82571EB_COPPER",
> > + "IGB_1G-82574L",
> > + "IGB_1G-82576",
> > + "IGB_1G-82576_QUAD_COPPER",
> > + "IGB_1G-82576_QUAD_COPPER_ET2",
> > + "IGB_1G-82580_COPPER",
> > + "IGB_1G-I210_COPPER",
> > + "IGB_1G-I350_COPPER",
> > + "IGB_1G-I354_SGMII",
> > + "IGB_1G-PCH_LPTLP_I218_LM",
> > + "IGB_1G-PCH_LPTLP_I218_V",
> > + "IGB_1G-PCH_LPT_I217_LM",
> > + "IGB_1G-PCH_LPT_I217_V",
> > + "IGB_2.5G-I354_BACKPLANE_2_5GBPS",
> > + "IGC-I225_LM",
> > + "IGC-I226_LM",
> > + "IXGBE_10G-82599_SFP",
> > + "IXGBE_10G-82599_SFP_SF_QP",
> > + "IXGBE_10G-82599_T3_LOM",
> > + "IXGBE_10G-82599_VF",
> > + "IXGBE_10G-X540T",
> > + "IXGBE_10G-X540_VF",
> > + "IXGBE_10G-X550EM_A_SFP",
> > + "IXGBE_10G-X550EM_X_10G_T",
> > + "IXGBE_10G-X550EM_X_SFP",
> > + "IXGBE_10G-X550EM_X_VF",
> > + "IXGBE_10G-X550T",
> > + "IXGBE_10G-X550_VF",
> > + "brcm_57414",
> > + "brcm_P2100G",
> > + "cavium_0011",
> > + "cavium_a034",
> > + "cavium_a063",
> > + "cavium_a064",
> > + "fastlinq_ql41000",
> > + "fastlinq_ql41000_vf",
> > + "fastlinq_ql45000",
> > + "fastlinq_ql45000_vf",
> > + "hi1822",
> > + "virtio"
> > + ]
> > + },
> > +
> > "ARCH": {
> > "type": "string",
> > "enum": [
> > @@ -94,6 +164,19 @@
> > "amount"
> > ]
> > },
> > + "pci_address": {
> > + "type": "string",
> > + "pattern":
> "^[\\da-fA-F]{4}:[\\da-fA-F]{2}:[\\da-fA-F]{2}.\\d:?\\w*$"
> > + },
> > + "port_peer_address": {
> > + "description": "Peer is a TRex port, and IXIA port or a PCI
> address",
> > + "oneOf": [
> > + {
> > + "description": "PCI peer port",
> > + "$ref": "#/definitions/pci_address"
> > + }
> > + ]
> > + },
> > "test_suite": {
> > "type": "string",
> > "enum": [
> > @@ -165,6 +248,44 @@
> > },
> > "hugepages": {
> > "$ref": "#/definitions/hugepages"
> > + },
> > + "ports": {
> > + "type": "array",
> > + "items": {
> > + "type": "object",
> > + "description": "Each port should be described on both
> sides of the connection. This makes configuration slightly more verbose but
> greatly simplifies implementation. If there are an inconsistencies, then
> DTS will not run until that issue is fixed. An example inconsistency would
> be port 1, node 1 says it is connected to port 1, node 2, but port 1, node
> 2 says it is connected to port 2, node 1.",
> > + "properties": {
> > + "pci": {
> > + "$ref": "#/definitions/pci_address",
> > + "description": "The local PCI address of the port"
> > + },
> > + "os_driver_for_dpdk": {
> > + "type": "string",
> > + "description": "The driver that the kernel should
> bind this device to for DPDK to use it. (ex: vfio-pci)"
> > + },
> > + "os_driver": {
> > + "type": "string",
> > + "description": "The driver normally used by this port
> (ex: i40e)"
> > + },
> > + "peer_node": {
> > + "type": "string",
> > + "description": "The name of the node the peer port is
> on"
> > + },
> > + "peer_pci": {
> > + "$ref": "#/definitions/pci_address",
> > + "description": "The PCI address of the peer port"
> > + }
> > + },
> > + "additionalProperties": false,
> > + "required": [
> > + "pci",
> > + "os_driver_for_dpdk",
> > + "os_driver",
> > + "peer_node",
> > + "peer_pci"
> > + ]
> > + },
> > + "minimum": 1
> > }
> > },
> > "additionalProperties": false,
> > @@ -211,8 +332,27 @@
> > ]
> > }
> > },
> > + "skip_smoke_tests": {
> > + "description": "Optional field that allows you to skip
> smoke testing",
> > + "type": "boolean"
> > + },
> > "system_under_test": {
> > - "$ref": "#/definitions/node_name"
> > + "type":"object",
> > + "properties": {
> > + "node_name": {
> > + "$ref": "#/definitions/node_name"
> > + },
> > + "vdevs": {
> > + "description": "Opentional list of names of vdevs to be
> used in execution",
> > + "type": "array",
> > + "items": {
> > + "type": "string"
> > + }
> > + }
> > + },
> > + "required": [
> > + "node_name"
> > + ]
> > }
> > },
> > "additionalProperties": false,
> > diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> > index 05022845..1b67938f 100644
> > --- a/dts/framework/dts.py
> > +++ b/dts/framework/dts.py
> > @@ -5,7 +5,13 @@
> >
> > import sys
> >
> > -from .config import CONFIGURATION, BuildTargetConfiguration,
> ExecutionConfiguration
> > +from .config import (
> > + CONFIGURATION,
> > + BuildTargetConfiguration,
> > + ExecutionConfiguration,
> > + TestSuiteConfig,
> > +)
> > +from .exception import BlockingTestSuiteError
> > from .logger import DTSLOG, getLogger
> > from .test_result import BuildTargetResult, DTSResult, ExecutionResult,
> Result
> > from .test_suite import get_test_suites
> > @@ -82,7 +88,7 @@ def _run_execution(
> > running all build targets in the given execution.
> > """
> > dts_logger.info(f"Running execution with SUT '{
> execution.system_under_test.name}'.")
> > - execution_result = result.add_execution(sut_node.config)
> > + execution_result = result.add_execution(sut_node.config,
> sut_node.node_info)
> >
> > try:
> > sut_node.set_up_execution(execution)
> > @@ -118,14 +124,15 @@ def _run_build_target(
> >
> > try:
> > sut_node.set_up_build_target(build_target)
> > - result.dpdk_version = sut_node.dpdk_version
> > + # result.dpdk_version = sut_node.dpdk_version
> > +
> build_target_result.add_build_target_versions(sut_node.get_build_target_info())
> > build_target_result.update_setup(Result.PASS)
> > except Exception as e:
> > dts_logger.exception("Build target setup failed.")
> > build_target_result.update_setup(Result.FAIL, e)
> >
> > else:
> > - _run_suites(sut_node, execution, build_target_result)
> > + _run_all_suites(sut_node, execution, build_target_result)
> >
> > finally:
> > try:
> > @@ -136,7 +143,7 @@ def _run_build_target(
> > build_target_result.update_teardown(Result.FAIL, e)
> >
> >
> > -def _run_suites(
> > +def _run_all_suites(
> > sut_node: SutNode,
> > execution: ExecutionConfiguration,
> > build_target_result: BuildTargetResult,
> > @@ -146,27 +153,62 @@ def _run_suites(
> > with possibly only a subset of test cases.
> > If no subset is specified, run all test cases.
> > """
> > + end_build_target = False
> > + if not execution.skip_smoke_tests:
> > + execution.test_suites[:0] =
> [TestSuiteConfig.from_dict("smoke_tests")]
> > for test_suite_config in execution.test_suites:
> > try:
> > - full_suite_path =
> f"tests.TestSuite_{test_suite_config.test_suite}"
> > - test_suite_classes = get_test_suites(full_suite_path)
> > - suites_str = ", ".join((x.__name__ for x in
> test_suite_classes))
> > - dts_logger.debug(
> > - f"Found test suites '{suites_str}' in
> '{full_suite_path}'."
> > + _run_single_suite(
> > + sut_node, execution, build_target_result,
> test_suite_config
> > )
> > - except Exception as e:
> > - dts_logger.exception("An error occurred when searching for
> test suites.")
> > - result.update_setup(Result.ERROR, e)
> > -
> > - else:
> > - for test_suite_class in test_suite_classes:
> > - test_suite = test_suite_class(
> > - sut_node,
> > - test_suite_config.test_cases,
> > - execution.func,
> > - build_target_result,
> > - )
> > - test_suite.run()
> > + except BlockingTestSuiteError as e:
> > + dts_logger.exception(
> > + f"An error occurred within
> {test_suite_config.test_suite}. "
> > + "Skipping build target..."
> > + )
> > + result.add_error(e)
> > + end_build_target = True
> > + # if a blocking test failed and we need to bail out of suite
> executions
> > + if end_build_target:
> > + break
> > +
> > +
> > +def _run_single_suite(
> > + sut_node: SutNode,
> > + execution: ExecutionConfiguration,
> > + build_target_result: BuildTargetResult,
> > + test_suite_config: TestSuiteConfig,
> > +) -> None:
> > + """Runs a single test suite.
> > +
> > + Args:
> > + sut_node: Node to run tests on.
> > + execution: Execution the test case belongs to.
> > + build_target_result: Build target configuration test case is
> run on
> > + test_suite_config: Test suite configuration
> > +
> > + Raises:
> > + BlockingTestSuiteError: If a test suite that was marked as
> blocking fails.
> > + """
> > + try:
> > + full_suite_path =
> f"tests.TestSuite_{test_suite_config.test_suite}"
> > + test_suite_classes = get_test_suites(full_suite_path)
> > + suites_str = ", ".join((x.__name__ for x in test_suite_classes))
> > + dts_logger.debug(f"Found test suites '{suites_str}' in
> '{full_suite_path}'.")
> > + except Exception as e:
> > + dts_logger.exception("An error occurred when searching for test
> suites.")
> > + result.update_setup(Result.ERROR, e)
> > +
> > + else:
> > + for test_suite_class in test_suite_classes:
> > + test_suite = test_suite_class(
> > + sut_node,
> > + test_suite_config.test_cases,
> > + execution.func,
> > + build_target_result,
> > + result,
> > + )
> > + test_suite.run()
> >
> >
> > def _exit_dts() -> None:
> > diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> > index ca353d98..dfb12df4 100644
> > --- a/dts/framework/exception.py
> > +++ b/dts/framework/exception.py
> > @@ -25,6 +25,7 @@ class ErrorSeverity(IntEnum):
> > SSH_ERR = 4
> > DPDK_BUILD_ERR = 10
> > TESTCASE_VERIFY_ERR = 20
> > + BLOCKING_TESTSUITE_ERR = 25
> >
> >
> > class DTSError(Exception):
> > @@ -144,3 +145,14 @@ def __init__(self, value: str):
> >
> > def __str__(self) -> str:
> > return repr(self.value)
> > +
> > +
> > +class BlockingTestSuiteError(DTSError):
> > + suite_name: str
> > + severity: ClassVar[ErrorSeverity] =
> ErrorSeverity.BLOCKING_TESTSUITE_ERR
> > +
> > + def __init__(self, suite_name: str) -> None:
> > + self.suite_name = suite_name
> > +
> > + def __str__(self) -> str:
> > + return f"Blocking suite {self.suite_name} failed."
> > diff --git a/dts/framework/remote_session/__init__.py
> b/dts/framework/remote_session/__init__.py
> > index ee221503..4fe32d35 100644
> > --- a/dts/framework/remote_session/__init__.py
> > +++ b/dts/framework/remote_session/__init__.py
> > @@ -1,5 +1,6 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +# Copyright(c) 2023 University of New Hampshire
> >
> > """
> > The package provides modules for managing remote connections to a
> remote host (node),
> > @@ -17,7 +18,14 @@
> >
> > from .linux_session import LinuxSession
> > from .os_session import OSSession
> > -from .remote import CommandResult, RemoteSession, SSHSession
> > +from .remote import (
> > + CommandResult,
> > + InteractiveRemoteSession,
> > + InteractiveShell,
> > + RemoteSession,
> > + SSHSession,
> > + TestPmdShell,
> > +)
> >
> >
> > def create_session(
> > diff --git a/dts/framework/remote_session/os_session.py
> b/dts/framework/remote_session/os_session.py
> > index 4c48ae25..4346ecc4 100644
> > --- a/dts/framework/remote_session/os_session.py
> > +++ b/dts/framework/remote_session/os_session.py
> > @@ -6,13 +6,19 @@
> > from collections.abc import Iterable
> > from pathlib import PurePath
> >
> > -from framework.config import Architecture, NodeConfiguration
> > +from framework.config import Architecture, NodeConfiguration, NodeInfo
> > from framework.logger import DTSLOG
> > from framework.settings import SETTINGS
> > from framework.testbed_model import LogicalCore
> > from framework.utils import EnvVarsDict, MesonArgs
> >
> > -from .remote import CommandResult, RemoteSession, create_remote_session
> > +from .remote import (
> > + CommandResult,
> > + InteractiveRemoteSession,
> > + RemoteSession,
> > + create_interactive_session,
> > + create_remote_session,
> > +)
> >
> >
> > class OSSession(ABC):
> > @@ -26,6 +32,7 @@ class OSSession(ABC):
> > name: str
> > _logger: DTSLOG
> > remote_session: RemoteSession
> > + interactive_session: InteractiveRemoteSession
> >
> > def __init__(
> > self,
> > @@ -37,6 +44,7 @@ def __init__(
> > self.name = name
> > self._logger = logger
> > self.remote_session = create_remote_session(node_config, name,
> logger)
> > + self.interactive_session =
> create_interactive_session(node_config, name, logger)
> >
> > def close(self, force: bool = False) -> None:
> > """
> > @@ -173,3 +181,15 @@ def setup_hugepages(self, hugepage_amount: int,
> force_first_numa: bool) -> None:
> > if needed and mount the hugepages if needed.
> > If force_first_numa is True, configure hugepages just on the
> first socket.
> > """
> > +
> > + @abstractmethod
> > + def get_compiler_version(self, compiler_name: str) -> str:
> > + """
> > + Get installed version of compiler used for DPDK
> > + """
> > +
> > + @abstractmethod
> > + def get_node_info(self) -> NodeInfo:
> > + """
> > + Collect information about the node
> > + """
> > diff --git a/dts/framework/remote_session/posix_session.py
> b/dts/framework/remote_session/posix_session.py
> > index d38062e8..f8ec159f 100644
> > --- a/dts/framework/remote_session/posix_session.py
> > +++ b/dts/framework/remote_session/posix_session.py
> > @@ -6,7 +6,7 @@
> > from collections.abc import Iterable
> > from pathlib import PurePath, PurePosixPath
> >
> > -from framework.config import Architecture
> > +from framework.config import Architecture, NodeInfo
> > from framework.exception import DPDKBuildError,
> RemoteCommandExecutionError
> > from framework.settings import SETTINGS
> > from framework.utils import EnvVarsDict, MesonArgs
> > @@ -219,3 +219,30 @@ def _remove_dpdk_runtime_dirs(
> >
> > def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> > return ""
> > +
> > + def get_compiler_version(self, compiler_name: str) -> str:
> > + match compiler_name:
> > + case "gcc":
> > + return self.send_command(f"{compiler_name} --version",
> 60).stdout.split(
> > + "\n"
> > + )[0]
>
> The timeouts are still there.
>
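Right, those should use SETTINGS.timeout like the other session calls -
roughly (sketch; the identical gcc/clang branches could also be
collapsed into one):

    def get_compiler_version(self, compiler_name: str) -> str:
        match compiler_name:
            case "gcc" | "clang":
                return self.send_command(
                    f"{compiler_name} --version", SETTINGS.timeout
                ).stdout.split("\n")[0]
            case "msvc":
                return self.send_command("cl", SETTINGS.timeout).stdout
            case "icc":
                return self.send_command(
                    f"{compiler_name} -V", SETTINGS.timeout
                ).stdout
            case _:
                raise ValueError(f"Unknown compiler {compiler_name}")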
> > + case "clang":
> > + return self.send_command(f"{compiler_name} --version",
> 60).stdout.split(
> > + "\n"
> > + )[0]
> > + case "msvc":
> > + return self.send_command("cl", 60).stdout
> > + case "icc":
> > + return self.send_command(f"{compiler_name} -V",
> 60).stdout
> > + case _:
> > + raise ValueError(f"Unknown compiler {compiler_name}")
> > +
> > + def get_node_info(self) -> NodeInfo:
> > + os_release_info = self.send_command(
> > + "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}'
> /etc/os-release",
> > + SETTINGS.timeout,
> > + ).stdout.split("\n")
> > + kernel_version = self.send_command("uname -r",
> SETTINGS.timeout).stdout
> > + return NodeInfo(
> > + os_release_info[0].strip(), os_release_info[1].strip(),
> kernel_version
> > + )
> > diff --git a/dts/framework/remote_session/remote/__init__.py
> b/dts/framework/remote_session/remote/__init__.py
> > index 8a151221..224598a8 100644
> > --- a/dts/framework/remote_session/remote/__init__.py
> > +++ b/dts/framework/remote_session/remote/__init__.py
> > @@ -1,16 +1,26 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +# Copyright(c) 2022-2023 University of New Hampshire
>
> There are other instances of the copyright statement. It's not
> necessary to change them all to just 2023, but I'd say it's
> preferable.
>
I agree. Now that I know more about them from your other comments, this
should also be changed.
>
> >
> > # pylama:ignore=W0611
> >
> > from framework.config import NodeConfiguration
> > from framework.logger import DTSLOG
> >
> > +from .interactive_remote_session import InteractiveRemoteSession
> > +from .interactive_shell import InteractiveShell
> > from .remote_session import CommandResult, RemoteSession
> > from .ssh_session import SSHSession
> > +from .testpmd_shell import TestPmdShell
> >
> >
> > def create_remote_session(
> > node_config: NodeConfiguration, name: str, logger: DTSLOG
> > ) -> RemoteSession:
> > return SSHSession(node_config, name, logger)
> > +
> > +
> > +def create_interactive_session(
> > + node_config: NodeConfiguration, name: str, logger: DTSLOG
> > +) -> InteractiveRemoteSession:
> > + return InteractiveRemoteSession(node_config, logger)
> > diff --git
> a/dts/framework/remote_session/remote/interactive_remote_session.py
> b/dts/framework/remote_session/remote/interactive_remote_session.py
> > new file mode 100644
> > index 00000000..e145d35d
> > --- /dev/null
> > +++ b/dts/framework/remote_session/remote/interactive_remote_session.py
> > @@ -0,0 +1,118 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 University of New Hampshire
> > +
> > +import socket
> > +import traceback
> > +from pathlib import PurePath
> > +from typing import Union
> > +
> > +from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore
> > +from paramiko.ssh_exception import ( # type: ignore
> > + AuthenticationException,
> > + BadHostKeyException,
> > + NoValidConnectionsError,
> > + SSHException,
> > +)
> > +
> > +from framework.config import InteractiveApp, NodeConfiguration
> > +from framework.exception import SSHConnectionError
> > +from framework.logger import DTSLOG
> > +
> > +from .interactive_shell import InteractiveShell
> > +from .testpmd_shell import TestPmdShell
> > +
> > +
> > +class InteractiveRemoteSession:
> > + hostname: str
> > + ip: str
> > + port: int
> > + username: str
> > + password: str
> > + _logger: DTSLOG
> > + _node_config: NodeConfiguration
> > + session: SSHClient
> > + _transport: Transport | None
> > +
> > + def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG)
> -> None:
> > + self._node_config = node_config
> > + self._logger = _logger
> > + self.hostname = node_config.hostname
> > + self.username = node_config.user
> > + self.password = node_config.password if node_config.password
> else ""
> > + port = "22"
> > + self.ip = node_config.hostname
> > + if ":" in node_config.hostname:
> > + self.ip, port = node_config.hostname.split(":")
> > + self.port = int(port)
> > + self._logger.info(
> > + f"Initializing interactive connection for {self.username}@
> {self.hostname}"
> > + )
> > + self._connect()
> > + self._logger.info(
> > + f"Interactive connection successful for {self.username}@
> {self.hostname}"
> > + )
> > +
> > + def _connect(self) -> None:
> > + client = SSHClient()
> > + client.set_missing_host_key_policy(AutoAddPolicy)
> > + self.session = client
> > + retry_attempts = 10
> > + for retry_attempt in range(retry_attempts):
> > + try:
> > + client.connect(
> > + self.ip,
> > + username=self.username,
> > + port=self.port,
> > + password=self.password,
> > + timeout=20 if self.port else 10,
> > + )
> > + except (TypeError, BadHostKeyException,
> AuthenticationException) as e:
> > + self._logger.exception(e)
> > + raise SSHConnectionError(self.hostname) from e
> > + except (NoValidConnectionsError, socket.error,
> SSHException) as e:
> > + self._logger.debug(traceback.format_exc())
> > + self._logger.warning(e)
> > + self._logger.info(
> > + "Retrying interactive session connection: "
> > + f"retry number {retry_attempt +1}"
> > + )
> > + else:
> > + break
> > + else:
> > + raise SSHConnectionError(self.hostname)
> > + # Interactive sessions are used on an "as needed" basis so we
> have
> > + # to set a keepalive
> > + self._transport = self.session.get_transport()
> > + if self._transport is not None:
> > + self._transport.set_keepalive(30)
> > +
> > + def create_interactive_shell(
> > + self,
> > + shell_type: InteractiveApp,
> > + path_to_app: PurePath,
> > + eal_parameters: str,
> > + timeout: float,
> > + ) -> Union[InteractiveShell, TestPmdShell]:
> > + """
> > + See "create_interactive_shell" in SutNode
> > + """
> > + match (shell_type):
> > + case InteractiveApp.shell:
> > + return InteractiveShell(
> > + self.session, self._logger, path_to_app, timeout
> > + )
> > + case InteractiveApp.testpmd:
> > + return TestPmdShell(
> > + self.session,
> > + self._logger,
> > + path_to_app,
> > + timeout=timeout,
> > + eal_flags=eal_parameters,
> > + )
> > + case _:
> > + self._logger.info(
> > + f"Unhandled app type {shell_type.name}, defaulting
> to shell."
> > + )
> > + return InteractiveShell(
> > + self.session, self._logger, path_to_app, timeout
> > + )
> > diff --git a/dts/framework/remote_session/remote/interactive_shell.py
> b/dts/framework/remote_session/remote/interactive_shell.py
> > new file mode 100644
> > index 00000000..4b0735c8
> > --- /dev/null
> > +++ b/dts/framework/remote_session/remote/interactive_shell.py
> > @@ -0,0 +1,99 @@
> > +from pathlib import PurePath
> > +
> > +from paramiko import Channel, SSHClient, channel # type: ignore
> > +
> > +from framework.logger import DTSLOG
> > +from framework.settings import SETTINGS
> > +
> > +
> > +class InteractiveShell:
> > +
> > + _interactive_session: SSHClient
> > + _stdin: channel.ChannelStdinFile
> > + _stdout: channel.ChannelFile
> > + _ssh_channel: Channel
> > + _logger: DTSLOG
> > + _timeout: float
> > + _path_to_app: PurePath
> > +
> > + def __init__(
> > + self,
> > + interactive_session: SSHClient,
> > + logger: DTSLOG,
> > + path_to_app: PurePath,
> > + timeout: float = SETTINGS.timeout,
> > + ) -> None:
> > + self._interactive_session = interactive_session
> > + self._ssh_channel = self._interactive_session.invoke_shell()
> > + self._stdin = self._ssh_channel.makefile_stdin("w")
> > + self._stdout = self._ssh_channel.makefile("r")
> > + self._ssh_channel.settimeout(timeout)
> > + self._ssh_channel.set_combine_stderr(True) # combines stdout
> and stderr streams
> > + self._logger = logger
> > + self._timeout = timeout
> > + self._path_to_app = path_to_app
> > +
> > +
> > + def send_command_no_output(self, command: str) -> None:
> > + """Send command to channel without recording output.
> > +
> > + This method will not verify any input or output, it will simply
> assume the
> > + command succeeded. This method will also consume all output in
> the buffer
> > + after executing the command.
> > + """
> > + self._logger.info(
> > + f"Sending command {command.strip()} and not collecting
> output"
> > + )
> > + self._stdin.write(f"{command}\n")
> > + self._stdin.flush()
> > + self.empty_stdout_buffer()
> > +
> > + def empty_stdout_buffer(self) -> None:
> > + """Removes all data from the stdout buffer.
> > +
> > + Because of the way paramiko handles read buffers, there is no
> way to effectively
> > + remove data from, or "flush", read buffers. This method
> essentially moves our
> > + offset on the buffer to the end and thus "removes" the data
> from the buffer.
> > + Timeouts are thrown on read operations of paramiko pipes based
> on whether data
> > + had been received before timeout so we assume that if we reach
> the timeout then
> > + we are at the end of the buffer.
> > + """
> > + self._ssh_channel.settimeout(0.5)
> > + try:
> > + for line in self._stdout:
> > + pass
> > + except TimeoutError:
> > + pass
> > + self._ssh_channel.settimeout(self._timeout) # reset timeout
> > +
> > + def send_command_get_output(self, command: str, prompt: str) -> str:
> > + """Send a command and get all output before the expected ending
> string.
> > +
> > + Lines that expect input are not included in the stdout buffer
> so they cannot be
> > + used for expect. For example, if you were prompted to log into
> something
> > + with a username and password, you cannot expect "username:"
> because it won't
> > + yet be in the stdout buffer. A work around for this could be
> consuming an
> > + extra newline character to force the current prompt into the
> stdout buffer.
> > +
> > + Returns:
> > + All output in the buffer before expected string
> > + """
> > + self._logger.info(f"Sending command {command.strip()}...")
> > + self._stdin.write(f"{command}\n")
> > + self._stdin.flush()
> > + out: str = ""
> > + for line in self._stdout:
> > + out += line
> > + if prompt in line and not line.rstrip().endswith(
> > + command.rstrip()
> > + ): # ignore line that sent command
> > + break
> > + self._logger.debug(f"Got output: {out}")
> > + return out
> > +
> > + def close(self) -> None:
> > + self._stdin.close()
> > + self._ssh_channel.close()
> > +
> > + def __del__(self) -> None:
> > + self.close()
> > diff --git a/dts/framework/remote_session/remote/testpmd_shell.py
> b/dts/framework/remote_session/remote/testpmd_shell.py
> > new file mode 100644
> > index 00000000..bde3468b
> > --- /dev/null
> > +++ b/dts/framework/remote_session/remote/testpmd_shell.py
> > @@ -0,0 +1,67 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 University of New Hampshire
> > +
> > +
> > +from pathlib import PurePath
> > +
> > +from paramiko import SSHClient # type: ignore
> > +
> > +from framework.logger import DTSLOG
> > +from framework.settings import SETTINGS
> > +
> > +from .interactive_shell import InteractiveShell
> > +
> > +
> > +class TestPmdShell(InteractiveShell):
> > + expected_prompt: str = "testpmd>"
> > + _eal_flags: str
> > +
> > + def __init__(
> > + self,
> > + interactive_session: SSHClient,
> > + logger: DTSLOG,
> > + path_to_testpmd: PurePath,
> > + eal_flags: str,
> > + timeout: float = SETTINGS.timeout,
> > + ) -> None:
> > + """Initializes an interactive testpmd session using specified
> parameters."""
> > + self._eal_flags = eal_flags
> > +
> > + super(TestPmdShell, self).__init__(
> > + interactive_session,
> > + logger=logger,
> > + path_to_app=path_to_testpmd,
> > + timeout=timeout,
> > + )
> > + self._start_application()
> > +
> > + def _start_application(self) -> None:
> > + """Starts a new interactive testpmd shell using _path_to_app.
> > + """
> > + self.send_command(
> > + f"{self._path_to_app} {self._eal_flags} -- -i",
> > + )
> > +
> > + def send_command(self, command: str, prompt: str = expected_prompt)
> -> str:
> > + """Specific way of handling the command for testpmd
> > +
> > + An extra newline character is consumed in order to force the
> current line into
> > + the stdout buffer.
> > + """
> > + return self.send_command_get_output(f"{command}\n", prompt)
> > +
> > + def get_devices(self) -> list[str]:
> > + """Get a list of device names that are known to testpmd
> > +
> > + Uses the device info listed in testpmd and then parses the
> output to
> > + return only the names of the devices.
> > +
> > + Returns:
> > + A list of strings representing device names (e.g.
> 0000:14:00.1)
> > + """
> > + dev_info: str = self.send_command("show device info all")
> > + dev_list: list[str] = []
> > + for line in dev_info.split("\n"):
> > + if "device name:" in line.lower():
> > + dev_list.append(line.strip().split(": ")[1].strip())
> > + return dev_list
> > diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> > index 74391982..2436eb7f 100644
> > --- a/dts/framework/test_result.py
> > +++ b/dts/framework/test_result.py
> > @@ -1,5 +1,6 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +# Copyright(c) 2022-2023 University of New Hampshire
> >
> > """
> > Generic result container and reporters
> > @@ -13,9 +14,11 @@
> > OS,
> > Architecture,
> > BuildTargetConfiguration,
> > + BuildTargetInfo,
> > Compiler,
> > CPUType,
> > NodeConfiguration,
> > + NodeInfo,
> > )
> > from .exception import DTSError, ErrorSeverity
> > from .logger import DTSLOG
> > @@ -67,12 +70,14 @@ class Statistics(dict):
> > Using a dict provides a convenient way to format the data.
> > """
> >
> > - def __init__(self, dpdk_version):
> > + def __init__(self, output_info: dict[str, str] | None):
> > super(Statistics, self).__init__()
> > for result in Result:
> > self[result.name] = 0
> > self["PASS RATE"] = 0.0
> > - self["DPDK VERSION"] = dpdk_version
> > + if output_info:
> > + for info_key, info_val in output_info.items():
> > + self[info_key] = info_val
> >
> > def __iadd__(self, other: Result) -> "Statistics":
> > """
> > @@ -206,6 +211,8 @@ class BuildTargetResult(BaseResult):
> > os: OS
> > cpu: CPUType
> > compiler: Compiler
> > + compiler_version: str | None
> > + dpdk_version: str | None
> >
> > def __init__(self, build_target: BuildTargetConfiguration):
> > super(BuildTargetResult, self).__init__()
> > @@ -213,6 +220,12 @@ def __init__(self, build_target:
> BuildTargetConfiguration):
> > self.os = build_target.os
> > self.cpu = build_target.cpu
> > self.compiler = build_target.compiler
> > + self.compiler_version = None
> > + self.dpdk_version = None
> > +
> > + def add_build_target_versions(self, versions: BuildTargetInfo) ->
> None:
> > + self.compiler_version = versions.compiler_version
> > + self.dpdk_version = versions.dpdk_version
> >
> > def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
> > test_suite_result = TestSuiteResult(test_suite_name)
> > @@ -228,10 +241,17 @@ class ExecutionResult(BaseResult):
> > """
> >
> > sut_node: NodeConfiguration
> > + sut_os_name: str
> > + sut_os_version: str
> > + sut_kernel_version: str
> >
> > - def __init__(self, sut_node: NodeConfiguration):
> > + def __init__(self, sut_node: NodeConfiguration, sut_version_info:
> NodeInfo):
> > super(ExecutionResult, self).__init__()
> > self.sut_node = sut_node
> > + self.sut_version_info = sut_version_info
> > + self.sut_os_name = sut_version_info.os_name
> > + self.sut_os_version = sut_version_info.os_version
> > + self.sut_kernel_version = sut_version_info.kernel_version
> >
> > def add_build_target(
> > self, build_target: BuildTargetConfiguration
> > @@ -258,6 +278,7 @@ class DTSResult(BaseResult):
> > """
> >
> > dpdk_version: str | None
> > + output: dict | None
> > _logger: DTSLOG
> > _errors: list[Exception]
> > _return_code: ErrorSeverity
> > @@ -267,14 +288,17 @@ class DTSResult(BaseResult):
> > def __init__(self, logger: DTSLOG):
> > super(DTSResult, self).__init__()
> > self.dpdk_version = None
> > + self.output = None
> > self._logger = logger
> > self._errors = []
> > self._return_code = ErrorSeverity.NO_ERR
> > self._stats_result = None
> > self._stats_filename = os.path.join(SETTINGS.output_dir,
> "statistics.txt")
> >
> > - def add_execution(self, sut_node: NodeConfiguration) ->
> ExecutionResult:
> > - execution_result = ExecutionResult(sut_node)
> > + def add_execution(
> > + self, sut_node: NodeConfiguration, sut_version_info: NodeInfo
> > + ) -> ExecutionResult:
> > + execution_result = ExecutionResult(sut_node, sut_version_info)
> > self._inner_results.append(execution_result)
> > return execution_result
> >
> > @@ -296,7 +320,8 @@ def process(self) -> None:
> > for error in self._errors:
> > self._logger.debug(repr(error))
> >
> > - self._stats_result = Statistics(self.dpdk_version)
> > + self._stats_result = Statistics(self.output)
> > + # add information gathered from the smoke tests to the
> statistics
> > self.add_stats(self._stats_result)
> > with open(self._stats_filename, "w+") as stats_file:
> > stats_file.write(str(self._stats_result))
> > diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> > index 0705f38f..5df5d2a6 100644
> > --- a/dts/framework/test_suite.py
> > +++ b/dts/framework/test_suite.py
> > @@ -11,10 +11,21 @@
> > import re
> > from types import MethodType
> >
> > -from .exception import ConfigurationError, SSHTimeoutError,
> TestCaseVerifyError
> > +from .exception import (
> > + BlockingTestSuiteError,
> > + ConfigurationError,
> > + SSHTimeoutError,
> > + TestCaseVerifyError,
> > +)
> > from .logger import DTSLOG, getLogger
> > from .settings import SETTINGS
> > -from .test_result import BuildTargetResult, Result, TestCaseResult,
> TestSuiteResult
> > +from .test_result import (
> > + BuildTargetResult,
> > + DTSResult,
> > + Result,
> > + TestCaseResult,
> > + TestSuiteResult,
> > +)
> > from .testbed_model import SutNode
> >
> >
> > @@ -37,10 +48,12 @@ class TestSuite(object):
> > """
> >
> > sut_node: SutNode
> > + is_blocking = False
> > _logger: DTSLOG
> > _test_cases_to_run: list[str]
> > _func: bool
> > _result: TestSuiteResult
> > + _dts_result: DTSResult
> >
> > def __init__(
> > self,
> > @@ -48,6 +61,7 @@ def __init__(
> > test_cases: list[str],
> > func: bool,
> > build_target_result: BuildTargetResult,
> > + dts_result: DTSResult,
> > ):
> > self.sut_node = sut_node
> > self._logger = getLogger(self.__class__.__name__)
> > @@ -55,6 +69,7 @@ def __init__(
> > self._test_cases_to_run.extend(SETTINGS.test_cases)
> > self._func = func
> > self._result =
> build_target_result.add_test_suite(self.__class__.__name__)
> > + self._dts_result = dts_result
> >
> > def set_up_suite(self) -> None:
> > """
> > @@ -118,6 +133,8 @@ def run(self) -> None:
> > f"the next test suite may be affected."
> > )
> > self._result.update_setup(Result.ERROR, e)
> > + if len(self._result.get_errors()) > 0 and self.is_blocking:
> > + raise BlockingTestSuiteError(test_suite_name)
> >
> > def _execute_test_suite(self) -> None:
> > """
> > diff --git a/dts/framework/testbed_model/node.py
> b/dts/framework/testbed_model/node.py
> > index d48fafe6..c5147e0e 100644
> > --- a/dts/framework/testbed_model/node.py
> > +++ b/dts/framework/testbed_model/node.py
> > @@ -40,6 +40,7 @@ class Node(object):
> > lcores: list[LogicalCore]
> > _logger: DTSLOG
> > _other_sessions: list[OSSession]
> > + _execution_config: ExecutionConfiguration
> >
> > def __init__(self, node_config: NodeConfiguration):
> > self.config = node_config
> > @@ -64,6 +65,7 @@ def set_up_execution(self, execution_config:
> ExecutionConfiguration) -> None:
> > """
> > self._setup_hugepages()
> > self._set_up_execution(execution_config)
> > + self._execution_config = execution_config
> >
> > def _set_up_execution(self, execution_config:
> ExecutionConfiguration) -> None:
> > """
> > diff --git a/dts/framework/testbed_model/sut_node.py
> b/dts/framework/testbed_model/sut_node.py
> > index 2b2b50d9..9b17ac3d 100644
> > --- a/dts/framework/testbed_model/sut_node.py
> > +++ b/dts/framework/testbed_model/sut_node.py
> > @@ -1,14 +1,27 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2010-2014 Intel Corporation
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +# Copyright(c) 2022-2023 University of New Hampshire
> >
> > import os
> > import tarfile
> > import time
> > from pathlib import PurePath
> > -
> > -from framework.config import BuildTargetConfiguration, NodeConfiguration
> > -from framework.remote_session import CommandResult, OSSession
> > +from typing import Union
> > +
> > +from framework.config import (
> > + BuildTargetConfiguration,
> > + BuildTargetInfo,
> > + InteractiveApp,
> > + NodeConfiguration,
> > + NodeInfo,
> > +)
> > +from framework.remote_session import (
> > + CommandResult,
> > + InteractiveShell,
> > + OSSession,
> > + TestPmdShell,
> > +)
> > from framework.settings import SETTINGS
> > from framework.utils import EnvVarsDict, MesonArgs
> >
> > @@ -16,6 +29,52 @@
> > from .node import Node
> >
> >
> > +class EalParameters(object):
> > + def __init__(
> > + self,
> > + lcore_list: LogicalCoreList,
> > + memory_channels: int,
> > + prefix: str,
> > + no_pci: bool,
> > + vdevs: list[VirtualDevice],
> > + other_eal_param: str,
> > + ):
> > + """
> > + Generate eal parameters character string;
> > + :param lcore_list: the list of logical cores to use.
> > + :param memory_channels: the number of memory channels to use.
> > + :param prefix: set file prefix string, eg:
> > + prefix='vf'
> > + :param no_pci: switch of disable PCI bus eg:
> > + no_pci=True
> > + :param vdevs: virtual device list, eg:
> > + vdevs=[
> > + VirtualDevice('net_ring0'),
> > + VirtualDevice('net_ring1')
> > + ]
> > + :param other_eal_param: user defined DPDK eal parameters, eg:
> > + other_eal_param='--single-file-segments'
> > + """
> > + self._lcore_list = f"-l {lcore_list}"
> > + self._memory_channels = f"-n {memory_channels}"
> > + self._prefix = prefix
> > + if prefix:
> > + self._prefix = f"--file-prefix={prefix}"
> > + self._no_pci = "--no-pci" if no_pci else ""
> > + self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
> > + self._other_eal_param = other_eal_param
> > +
> > + def __str__(self) -> str:
> > + return (
> > + f"{self._lcore_list} "
> > + f"{self._memory_channels} "
> > + f"{self._prefix} "
> > + f"{self._no_pci} "
> > + f"{self._vdevs} "
> > + f"{self._other_eal_param}"
> > + )
> > +
> > +
> > class SutNode(Node):
> > """
> > A class for managing connections to the System under Test, providing
> > @@ -30,9 +89,11 @@ class SutNode(Node):
> > _env_vars: EnvVarsDict
> > _remote_tmp_dir: PurePath
> > __remote_dpdk_dir: PurePath | None
> > - _dpdk_version: str | None
> > _app_compile_timeout: float
> > _dpdk_kill_session: OSSession | None
> > + _dpdk_version: str | None
> > + _node_info: NodeInfo | None
> > + _compiler_version: str | None
> >
> > def __init__(self, node_config: NodeConfiguration):
> > super(SutNode, self).__init__(node_config)
> > @@ -41,12 +102,14 @@ def __init__(self, node_config: NodeConfiguration):
> > self._env_vars = EnvVarsDict()
> > self._remote_tmp_dir = self.main_session.get_remote_tmp_dir()
> > self.__remote_dpdk_dir = None
> > - self._dpdk_version = None
> > self._app_compile_timeout = 90
> > self._dpdk_kill_session = None
> > self._dpdk_timestamp = (
> > f"{str(os.getpid())}_{time.strftime('%Y%m%d%H%M%S',
> time.localtime())}"
> > )
> > + self._dpdk_version = None
> > + self._node_info = None
> > + self._compiler_version = None
> >
> > @property
> > def _remote_dpdk_dir(self) -> PurePath:
> > @@ -75,6 +138,32 @@ def dpdk_version(self) -> str:
> > )
> > return self._dpdk_version
> >
> > + @property
> > + def node_info(self) -> NodeInfo:
> > + if self._node_info is None:
> > + self._node_info = self.main_session.get_node_info()
> > + return self._node_info
> > +
> > + @property
> > + def compiler_version(self) -> str:
> > + if self._compiler_version is None:
> > + if self._build_target_config is not None:
> > + self._compiler_version =
> self.main_session.get_compiler_version(
> > + self._build_target_config.compiler.name
> > + )
> > + else:
> > + self._logger.warning(
> > + "Failed to get compiler version because"
> > + "_build_target_config is None."
> > + )
> > + return ""
> > + return self._compiler_version
> > +
> > + def get_build_target_info(self) -> BuildTargetInfo:
> > + return BuildTargetInfo(
> > + dpdk_version=self.dpdk_version,
> compiler_version=self.compiler_version
> > + )
> > +
> > def _guess_dpdk_remote_dir(self) -> PurePath:
> > return
> self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
> >
> > @@ -84,6 +173,10 @@ def _set_up_build_target(
> > """
> > Setup DPDK on the SUT node.
> > """
> > + # we want to ensure that dpdk_version and compiler_version is
> reset for new
> > + # build targets
> > + self._dpdk_version = None
> > + self._compiler_version = None
> > self._configure_build_target(build_target_config)
> > self._copy_dpdk_tarball()
> > self._build_dpdk()
> > @@ -262,48 +355,43 @@ def run_dpdk_app(
> > f"{app_path} {eal_args}", timeout, verify=True
> > )
> >
> > -
> > -class EalParameters(object):
> > - def __init__(
> > + def create_interactive_shell(
> > self,
> > - lcore_list: LogicalCoreList,
> > - memory_channels: int,
> > - prefix: str,
> > - no_pci: bool,
> > - vdevs: list[VirtualDevice],
> > - other_eal_param: str,
> > - ):
> > + shell_type: InteractiveApp,
> > + path_to_app: PurePath | None = None,
> > + timeout: float = SETTINGS.timeout,
> > + eal_parameters: EalParameters | None = None,
> > + ) -> Union[InteractiveShell, TestPmdShell]:
> > + """Create a handler for an interactive session.
> > +
> > + This method is a factory that calls a method in OSSession to
> create shells for
> > + different DPDK applications.
> > +
> > + Args:
> > + shell_type: Enum value representing the desired application.
> > + path_to_app: Represents a path to the application you are
> attempting to
> > + launch. This path will be executed at the start of the
> app
> > + initialization. If one isn't provided, the default
> specified in the
> > + enumeration will be used.
> > + timeout: Timeout for reading output from the SSH channel.
> If you are
> > + reading from the buffer and don't receive any data
> within the timeout
> > + it will throw an error.
> > + eal_parameters: List of EAL parameters to use to launch the
> app. This is
> > + ignored for base "shell" types.
> > + Returns:
> > + Instance of the desired interactive application.
> > """
> > - Generate eal parameters character string;
> > - :param lcore_list: the list of logical cores to use.
> > - :param memory_channels: the number of memory channels to use.
> > - :param prefix: set file prefix string, eg:
> > - prefix='vf'
> > - :param no_pci: switch of disable PCI bus eg:
> > - no_pci=True
> > - :param vdevs: virtual device list, eg:
> > - vdevs=[
> > - VirtualDevice('net_ring0'),
> > - VirtualDevice('net_ring1')
> > - ]
> > - :param other_eal_param: user defined DPDK eal parameters, eg:
> > - other_eal_param='--single-file-segments'
> > - """
> > - self._lcore_list = f"-l {lcore_list}"
> > - self._memory_channels = f"-n {memory_channels}"
> > - self._prefix = prefix
> > - if prefix:
> > - self._prefix = f"--file-prefix={prefix}"
> > - self._no_pci = "--no-pci" if no_pci else ""
> > - self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
> > - self._other_eal_param = other_eal_param
> > + # if we just want a default shell, there is no need to append
> the DPDK build
> > + # directory to the path
> > + default_path = shell_type.get_path()
> >
> > - def __str__(self) -> str:
> > - return (
> > - f"{self._lcore_list} "
> > - f"{self._memory_channels} "
> > - f"{self._prefix} "
> > - f"{self._no_pci} "
> > - f"{self._vdevs} "
> > - f"{self._other_eal_param}"
> > + if shell_type != InteractiveApp.shell:
> > + default_path = self.main_session.join_remote_path(
> > + self.remote_dpdk_build_dir, shell_type.get_path()
> > + )
> > + return
> self.main_session.interactive_session.create_interactive_shell(
> > + shell_type,
> > + path_to_app if path_to_app else default_path,
> > + str(eal_parameters) if eal_parameters else "",
> > + timeout,
> > )
>
> I forgot to mention that I'd like to change the structure a bit here -
> calling self.main_session.create_interactive_shell() would make more
> sense to me, as the interactive_session is basically an implementation
> detail.
>
Sure, I'll make that change as well.
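Just to make sure we're on the same page, the call in
SutNode.create_interactive_shell() would then look roughly like this (a
sketch only, assuming OSSession grows a create_interactive_shell() wrapper
that forwards to its interactive session internally; neither the wrapper nor
its signature is final):

    # sketch only; the OSSession wrapper below is an assumption, not the
    # final API
    return self.main_session.create_interactive_shell(
        shell_type,
        path_to_app if path_to_app else default_path,
        str(eal_parameters) if eal_parameters else "",
        timeout,
    )

with OSSession doing something like:

    def create_interactive_shell(
        self, shell_type, path_to_app, eal_parameters, timeout
    ):
        # hide the interactive_session implementation detail from callers
        return self.interactive_session.create_interactive_shell(
            shell_type, path_to_app, eal_parameters, timeout
        )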
>
> > diff --git a/dts/tests/TestSuite_smoke_tests.py
> b/dts/tests/TestSuite_smoke_tests.py
> > new file mode 100644
> > index 00000000..b7e70ee1
> > --- /dev/null
> > +++ b/dts/tests/TestSuite_smoke_tests.py
> > @@ -0,0 +1,118 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 University of New Hampshire
> > +
> > +import re
> > +
> > +from framework.config import InteractiveApp
> > +from framework.remote_session import TestPmdShell
> > +from framework.settings import SETTINGS
> > +from framework.test_suite import TestSuite
> > +
> > +
> > +class SmokeTests(TestSuite):
> > + is_blocking = True
> > + # dicts in this list are expected to have two keys:
> > + # "pci_address" and "current_driver"
> > + nics_in_node: list[dict[str, str]] = []
> > +
> > + def set_up_suite(self) -> None:
> > + """
> > + Setup:
> > + Set the build directory path and generate a list of NICs in
> the SUT node.
> > + """
> > + self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
> > + for nic in self.sut_node.config.ports:
> > + new_dict = {
> > + "pci_address": nic.pci,
> > + "current_driver": nic.os_driver.strip(),
> > + }
> > + self.nics_in_node.append(new_dict)
> > +
> > + def test_unit_tests(self) -> None:
> > + """
> > + Test:
> > + Run the fast-test unit-test suite through meson.
> > + """
> > + self.sut_node.main_session.send_command(
> > + f"meson test -C {self.dpdk_build_dir_path} --suite
> fast-tests",
> > + 300,
> > + verify=True,
> > + )
> > +
> > + def test_driver_tests(self) -> None:
> > + """
> > + Test:
> > + Run the driver-test unit-test suite through meson.
> > + """
> > + list_of_vdevs = ""
> > + for dev in self.sut_node._execution_config.vdevs:
> > + list_of_vdevs += f"--vdev {dev} "
> > + list_of_vdevs = list_of_vdevs[:-1]
> > + if list_of_vdevs:
> > + self._logger.info(
> > + "Running driver tests with the following virtual "
> > + f"devices: {list_of_vdevs}"
> > + )
> > + self.sut_node.main_session.send_command(
> > + f"meson test -C {self.dpdk_build_dir_path} --suite
> driver-tests "
> > + f'--test-args "{list_of_vdevs}"',
> > + 300,
> > + verify=True,
> > + )
> > + else:
> > + self.sut_node.main_session.send_command(
> > + f"meson test -C {self.dpdk_build_dir_path} --suite
> driver-tests",
> > + 300,
> > + verify=True,
> > + )
> > +
> > + def test_devices_listed_in_testpmd(self) -> None:
> > + """
> > + Test:
> > + Uses testpmd driver to verify that devices have been found
> by testpmd.
> > + """
> > + testpmd_driver =
> self.sut_node.create_interactive_shell(InteractiveApp.testpmd)
> > + # We know it should always be a TestPmdShell but mypy doesn't
> > + assert isinstance(testpmd_driver, TestPmdShell)
> > + dev_list: list[str] = testpmd_driver.get_devices()
> > + for nic in self.nics_in_node:
> > + self.verify(
> > + nic["pci_address"] in dev_list,
> > + f"Device {nic['pci_address']} was not listed in
> testpmd's available devices, "
> > + "please check your configuration",
> > + )
> > +
> > + def test_device_bound_to_driver(self) -> None:
> > + """
> > + Test:
> > + Ensure that all drivers listed in the config are bound to
> the correct driver.
> > + """
> > + path_to_devbind = self.sut_node.main_session.join_remote_path(
> > + self.sut_node._remote_dpdk_dir, "usertools",
> "dpdk-devbind.py"
> > + )
> > +
> > + regex_for_pci_address = "/[0-9]{4}:[0-9]{2}:[0-9]{2}.[0-9]{1}/"
>
> This shouldn't be tucked away in a test case. It should be in some
> high-level module - let's put it in utils.py for the time being.
>
Good point, this could be something that is useful in many other places as
well. I just created it there because that's where I needed it, but it would
be good to make it easy for other people to reach.
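For now I'd probably just hoist it into utils.py as a module-level constant,
something along these lines (the constant name is a placeholder, not final):

    # dts/framework/utils.py
    # placeholder name; used to filter dpdk-devbind.py output for PCI lines
    REGEX_FOR_PCI_ADDRESS = "/[0-9]{4}:[0-9]{2}:[0-9]{2}.[0-9]{1}/"

and then import it in the test suite:

    from framework.utils import REGEX_FOR_PCI_ADDRESS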
>
> > + all_nics_in_dpdk_devbind =
> self.sut_node.main_session.send_command(
> > + f"{path_to_devbind} --status | awk
> '{regex_for_pci_address}'",
> > + SETTINGS.timeout,
> > + ).stdout
> > +
> > + for nic in self.nics_in_node:
> > + # This regular expression finds the line in the above
> string that starts
> > + # with the address for the nic we are on in the loop and
> then captures the
> > + # name of the driver in a group
> > + devbind_info_for_nic = re.search(
> > + f"{nic['pci_address']}[^\\n]*drv=([\\d\\w]*) [^\\n]*",
> > + all_nics_in_dpdk_devbind,
> > + )
> > + self.verify(
> > + devbind_info_for_nic is not None,
> > + f"Failed to find configured device
> ({nic['pci_address']}) using dpdk-devbind.py",
> > + )
> > + # We know this isn't None, but mypy doesn't
> > + assert devbind_info_for_nic is not None
> > + self.verify(
> > + devbind_info_for_nic.group(1) == nic["current_driver"],
> > + f"Driver for device {nic['pci_address']} does not match
> driver listed in "
> > + f"configuration (bound to
> {devbind_info_for_nic.group(1)})",
> > + )
> > --
> > 2.41.0
> >
>
Thank you for the comments!
Jeremy
^ permalink raw reply [flat|nested] 8+ messages in thread
* Re: [PATCH v2 0/2] Add DTS smoke tests
2023-07-10 16:19 [PATCH v2 0/2] Add DTS smoke tests jspewock
2023-07-10 16:19 ` [PATCH v2 1/2] dts: add " jspewock
2023-07-10 16:19 ` [PATCH v2 2/2] dts: add paramiko to dependencies jspewock
@ 2023-08-18 18:34 ` Adam Hassick
2 siblings, 0 replies; 8+ messages in thread
From: Adam Hassick @ 2023-08-18 18:34 UTC (permalink / raw)
To: jspewock; +Cc: dev
Recheck-request: iol-unit-amd64-testing, iol-compile-arm64-testing,
iol-broadcom-Functional
This email is a test for our (in-development) email-based retesting
framework. Please ignore.
On Mon, Jul 10, 2023 at 12:23 PM <jspewock@iol.unh.edu> wrote:
> From: Jeremy Spewock <jspewock@iol.unh.edu>
>
> This version of the series addresses comments and improvements listed on
> the previous non-RFC patch.
>
> RFCs for this patch:
> * v3: https://mails.dpdk.org/archives/dev/2023-June/269859.html
> * v2: https://mails.dpdk.org/archives/dev/2023-May/267915.html
> * v1: https://mails.dpdk.org/archives/dev/2023-April/266580.html
>
> Previous patch:
> * v1: https://mails.dpdk.org/archives/dev/2023-June/271309.html
>
> Jeremy Spewock (2):
> dts: add smoke tests
> dts: add paramiko to dependencies
>
> dts/conf.yaml | 17 +-
> dts/framework/config/__init__.py | 116 +++++++++--
> dts/framework/config/conf_yaml_schema.json | 142 +++++++++++++-
> dts/framework/dts.py | 88 ++++++---
> dts/framework/exception.py | 12 ++
> dts/framework/remote_session/__init__.py | 10 +-
> dts/framework/remote_session/os_session.py | 24 ++-
> dts/framework/remote_session/posix_session.py | 29 ++-
> .../remote_session/remote/__init__.py | 10 +
> .../remote/interactive_remote_session.py | 118 ++++++++++++
> .../remote/interactive_shell.py | 99 ++++++++++
> .../remote_session/remote/testpmd_shell.py | 67 +++++++
> dts/framework/test_result.py | 37 +++-
> dts/framework/test_suite.py | 21 +-
> dts/framework/testbed_model/node.py | 2 +
> dts/framework/testbed_model/sut_node.py | 180 +++++++++++++-----
> dts/poetry.lock | 160 ++++++++++++----
> dts/pyproject.toml | 1 +
> dts/tests/TestSuite_smoke_tests.py | 118 ++++++++++++
> 19 files changed, 1118 insertions(+), 133 deletions(-)
> create mode 100644
> dts/framework/remote_session/remote/interactive_remote_session.py
> create mode 100644
> dts/framework/remote_session/remote/interactive_shell.py
> create mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
> create mode 100644 dts/tests/TestSuite_smoke_tests.py
>
> --
> 2.41.0
>
>
--
*Adam Hassick*
Senior Developer
UNH InterOperability Lab
ahassick@iol.unh.edu
iol.unh.edu <https://www.iol.unh.edu/>
+1 (603) 475-8248
^ permalink raw reply [flat|nested] 8+ messages in thread
* Re: [PATCH v2 2/2] dts: add paramiko to dependencies
2023-07-10 16:19 ` [PATCH v2 2/2] dts: add paramiko to dependencies jspewock
2023-07-10 22:09 ` Patrick Robb
@ 2023-08-18 19:44 ` Adam Hassick
1 sibling, 0 replies; 8+ messages in thread
From: Adam Hassick @ 2023-08-18 19:44 UTC (permalink / raw)
To: jspewock; +Cc: dev
Recheck-request: iol-unit-arm64-testing
This email is a test for our (in-development) email-based retesting
framework. Please ignore.
On Mon, Jul 10, 2023 at 12:23 PM <jspewock@iol.unh.edu> wrote:
> From: Jeremy Spewock <jspewock@iol.unh.edu>
>
> added paramiko to the dependency files
>
> Signed-off-by: Jeremy Spewock <jspewock@iol.unh.edu>
> ---
> dts/poetry.lock | 160 ++++++++++++++++++++++++++++++++++-----------
> dts/pyproject.toml | 1 +
> 2 files changed, 124 insertions(+), 37 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index 0b2a007d..dfd9a240 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,20 +1,33 @@
> [[package]]
> name = "attrs"
> -version = "22.1.0"
> +version = "23.1.0"
> description = "Classes Without Boilerplate"
> category = "main"
> optional = false
> -python-versions = ">=3.5"
> +python-versions = ">=3.7"
>
> [package.extras]
> -dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest
> (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit",
> "cloudpickle"]
> -docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
> -tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest
> (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "zope.interface", "cloudpickle"]
> -tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler",
> "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins",
> "cloudpickle"]
> +cov = ["attrs", "coverage[toml] (>=5.3)"]
> +dev = ["attrs", "pre-commit"]
> +docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page",
> "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
> +tests = ["attrs", "zope-interface"]
> +tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)",
> "pympler", "pytest-mypy-plugins", "pytest-xdist", "pytest (>=4.3.0)"]
> +
> +[[package]]
> +name = "bcrypt"
> +version = "4.0.1"
> +description = "Modern password hashing for your software and your servers"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.extras]
> +tests = ["pytest (>=3.2.1,!=3.3.0)"]
> +typecheck = ["mypy"]
>
> [[package]]
> name = "black"
> -version = "22.10.0"
> +version = "22.12.0"
> description = "The uncompromising code formatter."
> category = "dev"
> optional = false
> @@ -33,6 +46,17 @@ d = ["aiohttp (>=3.7.4)"]
> jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
> uvloop = ["uvloop (>=0.15.2)"]
>
> +[[package]]
> +name = "cffi"
> +version = "1.15.1"
> +description = "Foreign Function Interface for Python calling C code."
> +category = "main"
> +optional = false
> +python-versions = "*"
> +
> +[package.dependencies]
> +pycparser = "*"
> +
> [[package]]
> name = "click"
> version = "8.1.3"
> @@ -52,18 +76,39 @@ category = "dev"
> optional = false
> python-versions =
> "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
>
> +[[package]]
> +name = "cryptography"
> +version = "41.0.1"
> +description = "cryptography is a package which provides cryptographic
> recipes and primitives to Python developers."
> +category = "main"
> +optional = false
> +python-versions = ">=3.7"
> +
> +[package.dependencies]
> +cffi = ">=1.12"
> +
> +[package.extras]
> +docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
> +docstest = ["pyenchant (>=1.6.11)", "twine (>=1.12.0)",
> "sphinxcontrib-spelling (>=4.0.1)"]
> +nox = ["nox"]
> +pep8test = ["black", "ruff", "mypy", "check-sdist"]
> +sdist = ["build"]
> +ssh = ["bcrypt (>=3.1.5)"]
> +test = ["pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov",
> "pytest-xdist", "pretend"]
> +test-randomorder = ["pytest-randomly"]
> +
> [[package]]
> name = "isort"
> -version = "5.10.1"
> +version = "5.12.0"
> description = "A Python utility / library to sort Python imports."
> category = "dev"
> optional = false
> -python-versions = ">=3.6.1,<4.0"
> +python-versions = ">=3.8.0"
>
> [package.extras]
> -pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
> -requirements_deprecated_finder = ["pipreqs", "pip-api"]
> -colors = ["colorama (>=0.4.3,<0.5.0)"]
> +colors = ["colorama (>=0.4.3)"]
> +requirements-deprecated-finder = ["pip-api", "pipreqs"]
> +pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs",
> "requirementslib"]
> plugins = ["setuptools"]
>
> [[package]]
> @@ -87,7 +132,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*,
> !=3.3.*"
>
> [[package]]
> name = "jsonschema"
> -version = "4.17.0"
> +version = "4.17.3"
> description = "An implementation of JSON Schema validation for Python"
> category = "main"
> optional = false
> @@ -129,15 +174,33 @@ reports = ["lxml"]
>
> [[package]]
> name = "mypy-extensions"
> -version = "0.4.3"
> -description = "Experimental type system extensions for programs checked
> with the mypy typechecker."
> +version = "1.0.0"
> +description = "Type system extensions for programs checked with the mypy
> type checker."
> category = "dev"
> optional = false
> -python-versions = "*"
> +python-versions = ">=3.5"
> +
> +[[package]]
> +name = "paramiko"
> +version = "3.2.0"
> +description = "SSH2 protocol library"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.dependencies]
> +bcrypt = ">=3.2"
> +cryptography = ">=3.3"
> +pynacl = ">=1.5"
> +
> +[package.extras]
> +all = ["pyasn1 (>=0.1.7)", "invoke (>=2.0)", "gssapi (>=1.4.1)", "pywin32
> (>=2.1.8)"]
> +gssapi = ["pyasn1 (>=0.1.7)", "gssapi (>=1.4.1)", "pywin32 (>=2.1.8)"]
> +invoke = ["invoke (>=2.0)"]
>
> [[package]]
> name = "pathspec"
> -version = "0.10.1"
> +version = "0.11.1"
> description = "Utility library for gitignore style pattern matching of
> file paths."
> category = "dev"
> optional = false
> @@ -156,15 +219,15 @@ ptyprocess = ">=0.5"
>
> [[package]]
> name = "platformdirs"
> -version = "2.5.2"
> -description = "A small Python module for determining appropriate
> platform-specific dirs, e.g. a \"user data dir\"."
> +version = "3.5.1"
> +description = "A small Python package for determining appropriate
> platform-specific dirs, e.g. a \"user data dir\"."
> category = "dev"
> optional = false
> python-versions = ">=3.7"
>
> [package.extras]
> -docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)",
> "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
> -test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)",
> "pytest (>=6)"]
> +docs = ["furo (>=2023.3.27)", "proselint (>=0.13)",
> "sphinx-autodoc-typehints (>=1.23,!=1.23.4)", "sphinx (>=6.2.1)"]
> +test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest-cov (>=4)",
> "pytest-mock (>=3.10)", "pytest (>=7.3.1)"]
>
> [[package]]
> name = "ptyprocess"
> @@ -176,25 +239,33 @@ python-versions = "*"
>
> [[package]]
> name = "pycodestyle"
> -version = "2.9.1"
> +version = "2.10.0"
> description = "Python style guide checker"
> category = "dev"
> optional = false
> python-versions = ">=3.6"
>
> +[[package]]
> +name = "pycparser"
> +version = "2.21"
> +description = "C parser in Python"
> +category = "main"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
> +
> [[package]]
> name = "pydocstyle"
> -version = "6.1.1"
> +version = "6.3.0"
> description = "Python docstring style checker"
> category = "dev"
> optional = false
> python-versions = ">=3.6"
>
> [package.dependencies]
> -snowballstemmer = "*"
> +snowballstemmer = ">=2.2.0"
>
> [package.extras]
> -toml = ["toml"]
> +toml = ["tomli (>=1.2.3)"]
>
> [[package]]
> name = "pyflakes"
> @@ -228,9 +299,24 @@ tests = ["pytest (>=7.1.2)", "pytest-mypy",
> "eradicate (>=2.0.0)", "radon (>=5.1
> toml = ["toml (>=0.10.2)"]
> vulture = ["vulture"]
>
> +[[package]]
> +name = "pynacl"
> +version = "1.5.0"
> +description = "Python binding to the Networking and Cryptography (NaCl)
> library"
> +category = "main"
> +optional = false
> +python-versions = ">=3.6"
> +
> +[package.dependencies]
> +cffi = ">=1.4.1"
> +
> +[package.extras]
> +docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
> +tests = ["pytest (>=3.2.1,!=3.3.0)", "hypothesis (>=3.27.0)"]
> +
> [[package]]
> name = "pyrsistent"
> -version = "0.19.1"
> +version = "0.19.3"
> description = "Persistent/Functional/Immutable data structures"
> category = "main"
> optional = false
> @@ -270,7 +356,7 @@ python-versions = ">=3.7"
>
> [[package]]
> name = "types-pyyaml"
> -version = "6.0.12.1"
> +version = "6.0.12.10"
> description = "Typing stubs for PyYAML"
> category = "main"
> optional = false
> @@ -278,7 +364,7 @@ python-versions = "*"
>
> [[package]]
> name = "typing-extensions"
> -version = "4.4.0"
> +version = "4.6.2"
> description = "Backported and Experimental Type Hints for Python 3.7+"
> category = "dev"
> optional = false
> @@ -299,13 +385,16 @@ jsonschema = ">=4,<5"
> [metadata]
> lock-version = "1.1"
> python-versions = "^3.10"
> -content-hash =
> "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
> +content-hash =
> "c119901b1c13b14adf9b53624430aff8720d9a3a180b028579b82c65d49474df"
>
> [metadata.files]
> attrs = []
> +bcrypt = []
> black = []
> +cffi = []
> click = []
> colorama = []
> +cryptography = []
> isort = []
> jsonpatch = []
> jsonpointer = []
> @@ -313,20 +402,17 @@ jsonschema = []
> mccabe = []
> mypy = []
> mypy-extensions = []
> +paramiko = []
> pathspec = []
> -pexpect = [
> - {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash =
> "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
> - {file = "pexpect-4.8.0.tar.gz", hash =
> "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
> -]
> -platformdirs = [
> - {file = "platformdirs-2.5.2-py3-none-any.whl", hash =
> "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
> - {file = "platformdirs-2.5.2.tar.gz", hash =
> "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
> -]
> +pexpect = []
> +platformdirs = []
> ptyprocess = []
> pycodestyle = []
> +pycparser = []
> pydocstyle = []
> pyflakes = []
> pylama = []
> +pynacl = []
> pyrsistent = []
> pyyaml = []
> snowballstemmer = []
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index a136c91e..928837cb 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -13,6 +13,7 @@ pexpect = "^4.8.0"
> warlock = "^2.0.1"
> PyYAML = "^6.0"
> types-PyYAML = "^6.0.8"
> +paramiko = "^3.1.0"
>
> [tool.poetry.dev-dependencies]
> mypy = "^0.961"
> --
> 2.41.0
>
>
^ permalink raw reply [flat|nested] 8+ messages in thread
end of thread, other threads:[~2023-08-18 19:44 UTC | newest]
Thread overview: 8+ messages (download: mbox.gz / follow: Atom feed)
2023-07-10 16:19 [PATCH v2 0/2] Add DTS smoke tests jspewock
2023-07-10 16:19 ` [PATCH v2 1/2] dts: add " jspewock
2023-07-11 9:42 ` Juraj Linkeš
2023-07-11 18:30 ` Jeremy Spewock
2023-07-10 16:19 ` [PATCH v2 2/2] dts: add paramiko to dependencies jspewock
2023-07-10 22:09 ` Patrick Robb
2023-08-18 19:44 ` Adam Hassick
2023-08-18 18:34 ` [PATCH v2 0/2] Add DTS smoke tests Adam Hassick